
Samsung Ships the First Blu-Ray Player

DigitalDame2 writes "PCMag.com reports that beginning June 25th, consumers will be able to purchase the first Blu-Ray player: the Samsung BD-P1000. The BD-P1000 is twice the price of the HD-A1 ($999.99 list), but supports full 1080p playback, something the first generation of HD-DVD players do not. It also up-converts conventional DVDs to 1080p to improve video quality and comes with HDMI, Component, S-video, and composite outputs. The BD-P1000 will be sold at more than 200 retail locations, including Best Buy, Tweeter, and Circuit City, and 10 Blu-Ray titles will be available as well."
This discussion has been archived. No new comments can be posted.

  • by ackthpt ( 218170 ) * on Thursday June 15, 2006 @07:18PM (#15544907) Homepage Journal

    Gosh! Only $999.99 list (or, as we learned from The Price Is Right, the price you ask if you never plan to actually sell any except to the most gullible or desperate; the actual street price will probably be about $700). I can wait.

    When VCRs came out I bought a rather nice one for ~$900 US. When CDs came out I bought a nifty CD player for about $700 US. I was a little more patient with DVDs, but eventually got a DVD drive for a home computer and then a portable player (computer drive ~$70 US, portable ~$1000 US). As I'm pretty well past the point of being impressed with eye candy in cinema, I'll probably only get a Blu-Ray player when there are significant offerings and most of the newer films I must have are only available via that channel.

    • Last time I was in Shanghai (September 2005), I found a DVD player for 85 yuan. Do the conversion, and that's about 10 US dollars! That's CHEAPER than some no-name movies. As for the player itself, it was equivalent to an Apex Digital Inc brand. Not bad at all. Not feature-rich either...

      It really is a supply/demand issue, of course. If Blu-Ray or HD-DVD becomes a popular format, expect to see these players for less than 100 bucks next year. Chinese brands, of course.
    • Just to add some prices for reference:

      A friend of mine bought a Beta II machine in 1980 for ~$2000 US, complete with a wired remote for pausing video (for skipping commercials).
      When it looked like VHS was winning the format war around 1983/84, I bought my JVC VHS deck for ~$500 US; it had a wireless (IR) remote and a programmable timer that you had to tune one of the 13 available tuners to.
      My first CD player, 1985: ~$150 US.

      I paid ~$200 US for a HiFi VCR from Samsung in 1992.
      First DVD player for my computer was ~$110 US a
    • Anyone who buys this first Blu-Ray player will confirm the theory that a sucker is born every minute. Assuming Blu-Ray is even around a year from now, players will cost maybe $300-$400. Remember how fast DVD players came down in price? This is probably just to make money from the 10 or so people who are stupid enough to pay a grand just to have bragging rights, and then the price promptly drops by a couple hundred $$.
  • No! (Score:3, Funny)

    by rbarreira ( 836272 ) on Thursday June 15, 2006 @07:19PM (#15544914) Homepage
    No, the PS3 had it first! Oh wait...
  • by Anonymous Coward
    How on earth are they getting away with making DVDs look better by 'upscaling' them?

    Are they using the "zoom, enhance" method that we've seen in movies for so long... or are they recreating information which did not exist on the DVD using some crazy AI?

    Those kids at Samsung, what will they think of next!

    • I have a Samsung upscaling DVD player with HDMI output. Quite honestly, any flat-panel TV is already going to do this to display a 480p image on a 720p/1080p panel, so there is no difference whether the scaler is enabled on the player or not.

      This might do something if you have an old CRT HDTV, but I really don't see how there could be much improvement. It's kind of like taking a 128k MP3 file, opening it in some audio tool, then saving it at 256k. You can't ever get back the missing bits.
      • You're right, you can't do that. However, you CAN take a JPEG that has lost some data, apply some intelligent filters, and clean out the blockiness, the random color shifts in what should have been a solid color, and the interesting aliasing artifacts around what used to be straight lines: smooth over the abrupt colors and straighten out / de-alias the lines (I don't want to say antialias).

        It's not impossible, just difficult. Compare the ATI TV Wonder Elite (it's not the only one,
        • Disclaimer: I actually have no idea what the hell i'm talking about. I just read that review recently and was amazed, so I'm trying to sound like I know what I'm talking about.

          By any chance, have you stayed at a Holiday Express anytime lately?
      • By burning through some CPU time, I've got no problem doing things to make videos look arguably better than the original. It can be hit-or-miss sometimes, but applying a light sharpen and denoise filter, with a decent deinterlacing and then a smart upscaling can actually produce some decent results.

        I would make the argument that the Creative X-Fi is effectively supposed to do that for MP3s, but not having used or heard one I can't say. In the end it's all intelligent guesswork being done by the decoder,

        • On a semi-related note, I was browsing the HD-DVD section today just for kicks, and every movie I looked at said it was 1080p. Is it safe to assume that there's just a first-gen player limitation, rather than everyone just assuming that we can't fit 1080p content onto 15GB (or 30GB, if DL)?

          1080p movies can easily fit on HD-DVDs (even with extras included, though nearly all HD-DVD extras so far are the old 480i/p ones found on DVDs), and that's what most of the studios are doing. I suspect that the limitatio

          • So, in that case, does blu-ray have anything going for it? Higher price, more restrictive, preinstalled rootkit, effectively unreleased, and no advantages with picture quality. Are they relying solely on the fact that the PS3 will have a blu-ray drive installed? Have they completely failed to realize that DVDs were successful because they had no competition, rather than because they had a large console launch that supported the media (and, may I add, supported it quite shittily)?

            As an added bonus, some

            • So, in that case, does blu-ray have anything going for it?

              Well, Blu-Ray does have something going for it: a size advantage of 10GB (25GB versus 15GB) per layer over HD-DVD. The thing is, this advantage is diminished somewhat in both the movie and gaming environments by the fact that 15GB/30GB (single-layer/dual-layer) is typically more than enough space for this kind of pre-recorded material, as HD-DVD is proving by putting 1080p movies on its discs. Admittedly, this can be argued a bit by fo

        • by Teddy Beartuzzi ( 727169 ) on Thursday June 15, 2006 @11:37PM (#15546265) Journal
          On a semi-related note, I was browsing the HD-DVD section today just for kicks, and every movie I looked at said it was 1080p. Is it safe to assume that there's just a first-gen player limitation, rather than everyone just assuming that we can't fit 1080p stuff on to 15GB (or 30GB, if DL)?

          The whole "but it's only 1080i" thing is a total red herring. From the dvdtalk review [dvdtalk.com]:

          "In the last couple of days, several technical issues have been put to rest, at least for me. The first was the common accusation that the initial HD DVD players like the Toshiba HD-A1 are deficient because they don't output "full 1080p" resolution, that they are "1080i only." I don't see this as a practical concern. All HD DVD and Blu-ray discs will encode film-sourced material in full 1920x1080 progressive scan resolution at 24 frames per second, which is the film industry standard.

          Unfortunately many folks are confusing 1080i acquisition with 1080i transmission. The primary reason we get interlacing artifacts in a 480i, 576i, or 1080i signal is that the frame was originally captured in interlaced format, with the odd scan lines and even scan lines being recorded at two different moments in time. When you reassemble two fields that are offset in time, you get jaggies, moire patterns, barber pole effects, and line twitter. That is not true of either HD DVD or Blu-ray film transfers since the image is scanned progressively from a film frame that represents a single moment in time.

          Therefore we would expect to see none of the common evidence of deinterlacing when watching HD DVD or Blu-ray movies that are being transmitted via 1080i. Our first look at HD DVD in 1080i confirms this expectation. After hours of viewing three different HD DVD movies there is simply no evidence of any artifact that might be attributed to the fact that the signal was transmitted in 1080i format. The picture is as clean, stable, and as artifact-free as it could be. There is no visible defect in the image that would be eliminated by switching to 1080p transmission."

          Make your decision on HD-DVD vs Blu-Ray, but don't base it on this bogus 1080i issue.
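The review's argument can be shown mechanically. A minimal Python sketch (an editorial illustration of the principle, not anything from the players themselves): split a progressive frame into two fields for "interlaced" transmission, then weave them back together. Because both fields were sampled at the same instant, the original frame is reconstructed exactly, with no deinterlacing artifacts.

```python
# Why 1080i transmission of film-sourced (progressive) material loses nothing:
# both fields come from the same moment in time, so weaving them back
# together reconstructs the original frame bit-for-bit.

def split_fields(frame):
    """Split a progressive frame (a list of scan lines) into two fields."""
    top = frame[0::2]      # even-numbered lines
    bottom = frame[1::2]   # odd-numbered lines
    return top, bottom

def weave(top, bottom):
    """Reassemble a frame by interleaving the two fields."""
    frame = []
    for t, b in zip(top, bottom):
        frame.append(t)
        frame.append(b)
    return frame

# Toy 6-line "frame"; real 1080p has 1080 lines, but the principle is identical.
frame = ["line%d" % i for i in range(6)]
top, bottom = split_fields(frame)
assert weave(top, bottom) == frame   # lossless round trip
```

Artifacts like jaggies and line twitter only appear when the two fields were captured at two different moments, which is never the case for a film transfer.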

    • by voxel ( 70407 ) on Thursday June 15, 2006 @09:48PM (#15545745)
      I don't know if they do it on this player or not, but there is some easy-setup software you can do on your PC to check it out for yourself.

      Use ffdshow (Google for it). It is a DirectShow filter with which you can apply many effects to an image.

      The trick is to scale the DVD's 720x480 up to 1080p (or whatever you want), then apply a Lanczos sharpening filter on ONLY the luma channel. *NOTE: I think I got that right, Lanczos on the luma channel; it's been a while, forgive me if I'm misremembering.

      There are actually lots of articles on the net (again google), that talk about this technique.

      So I tried it for myself. Lo and behold, the image really DOES look better. It amazingly adds "perceived" detail.

      The trick, again, is sharpening only one channel in the image (luma/chroma/something else... I'm no expert).
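For the curious, the luma-only trick can be sketched in a few lines of Python. This is a hand-rolled illustration, not ffdshow's actual filter chain: the conversion coefficients are the standard BT.601 ones, and a simple 3-tap unsharp mask stands in for the Lanczos-based sharpening described above. The point it demonstrates is that sharpening Y while passing Cb/Cr through boosts edge contrast without introducing color fringing.

```python
# Illustration of "sharpen only the luma channel" (assumed pipeline, not
# ffdshow's actual one). Chroma is left untouched, so sharpening halos
# don't shift colors; the eye is far more sensitive to luma detail anyway.

def rgb_to_ycbcr(r, g, b):
    """Full-range RGB -> YCbCr using BT.601-style coefficients."""
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.169 * r - 0.331 * g + 0.500 * b
    cr =  0.500 * r - 0.419 * g - 0.081 * b
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.402 * cr
    g = y - 0.344 * cb - 0.714 * cr
    b = y + 1.772 * cb
    return r, g, b

def sharpen_luma(row, amount=0.5):
    """Unsharp-mask the Y channel of a row of RGB pixels; pass chroma through."""
    ycc = [rgb_to_ycbcr(*px) for px in row]
    ys = [y for y, _, _ in ycc]
    out = []
    for i, (y, cb, cr) in enumerate(ycc):
        left = ys[max(i - 1, 0)]
        right = ys[min(i + 1, len(ys) - 1)]
        blurred = (left + y + right) / 3.0
        y2 = y + amount * (y - blurred)   # boost edges in luma only
        out.append(ycbcr_to_rgb(y2, cb, cr))
    return out

row = [(50, 50, 50), (50, 50, 50), (200, 200, 200), (200, 200, 200)]
sharp = sharpen_luma(row)
# Contrast across the edge increases; gray pixels stay neutral (cb = cr = 0).
assert sharp[1][0] < row[1][0] and sharp[2][0] > row[2][0]
```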

    • The real news is that the first Blu-Ray player has been released. Upscaling DVD players have been available for a long, long time. But if you must always look upon the old with fresh eyes, here's an overview of one upscaler, Faroudja's DCDi [gnss.com].
  • Wow... (Score:4, Insightful)

    by martinultima ( 832468 ) <martinultima@gmail.com> on Thursday June 15, 2006 @07:20PM (#15544919) Homepage Journal
    A thousand bucks, and there will only be ten titles when it first comes out? Now I can see why only obsessive early adopters would want something like this – quite honestly, I just don't see the point of getting a $1000 device that can only play 10 titles (no matter how high-definition the titles and/or the point may be).
    • Re:Wow... (Score:3, Funny)

      by Umbral Blot ( 737704 )
      I think the point is just so it can be out there, to help solidify the standard and raise consumer awareness. That way, when titles do start being released on Blu-Ray, you will at least know that they can be played, versus the other marketing strategy, where they release a bunch of movies on Blu-Ray but no player, which would be stupid.
    • How many titles are there for HD-DVD?

      If it's a lot greater than 10, I think we may have a winner for the next standard.
    • I just don't see the point of getting a $1000 device that can only play 10 titles (no matter how high-definition the titles and/or the point may be).

      If it can play recorded BD-Roms I can easily see 5000 units being sold to advertising agencies for demo purposes. $1800 (burner plus player) is pretty cheap if it can help you maintain your contract list -- maintain only since your competitors had the same thought.
    • $1K is nothing. Early HDTV setups set their early adopters back ten times as much.

      Also, remember that you don't buy expensive toys just to use them. You also buy them to shame all your friends and neighbors whose toys aren't as expensive as yours.

      • Bragging rights is a small part of it. People that buy into the first wave of technology are of two mindsets. Either they love/lust technology or it's for some self-gratification of being a part of the "elite in-crowd".

        You check into the home audio/theater scene. The people are very eccentric and snobbish! It's almost a religious way of life to these people.
        • Re:Wow... (Score:3, Insightful)

          There are also a lot of people to whom $1K really isn't any significant amount of money. I don't happen to be one of those people, but I know a few :)
        • Some people really enjoy the idea of watching films in a theater (big screen, loud sounds-- up to 115 dB for the LFE, and superb picture and sound quality) but they can't stand the reality (overpriced popcorn, sticky floors, cell phones, hiring a babysitter). So they build their own theaters. What's wrong with that?

          Besides, LOTR doesn't play locally, at least anymore.
          • There's nothing wrong with it. Some people invest in a backyard swimming pool, others a home theater system. But I'm not talking about your average person. I'm talking about those that spend the big bucks on bleeding-edge technologies. Again, there's nothing wrong with it. However, I find such behavior highly impractical, not to mention far from cost-effective.

            Besides, first-gen technologies tend to be riddled with bugs. Just hold out till a newer product revision is available: fewer bugs, and a lower price.
    • by AWhiteFlame ( 928642 ) on Thursday June 15, 2006 @08:07PM (#15545210) Homepage
      All 10 titles? You should -be- so lucky. Back in my day, we had -1- 240x180 AVI of a CGI dancing baby and we -liked- it. You young whippersnappers and your "1820q" and your "ePod" and your "skipe".
      • YOU LUCKY BASTARD! Back in my day, we had a half-finished 40x25 ANSI animation written in QBasic, and we liked that a hell of a lot more than you and your stupid dancing baby! Oh, and that new-fangled IBM PC contraption or whatever it's called? Mark my words, that thing'll never catch on!
        • 40x25 animation? QBasic? Luxury!

          Back in my day we flickered the segments of 7-segment numeric LED displays, and did it by punching in machine code on the hex keypad.

          (I can hear the next one coming: "Keypads!? You had keypads?... We had to short out the contacts wi' our tongue, and put wires on our eyeballs to see anything..." Although making flip-books from punch cards is probably more realistic.)
    • The point is to extract the maximum amount of cash possible from the obsessive early adopters before coming out with more reasonably priced models.

      Additionally, these early units usually contain all the bells and whistles in order to prove out every aspect of the technology. Later, cheaper models often have expensive but little-used features dropped from the product.
    • The Fifth Element is a launch title.
    • "I just don't see the point of getting a $1000 device that can only play 10 titles (no matter how high-definition the titles and/or the point may be)."

      Perhaps not. I have to ask, though: Have you ever pre-purchased something? Waited in line for hours to see a movie? Purchased any computer related hardware the day it was released? If the answer to any of these questions is 'yes', then you really shouldn't be throwing any stones. I know I'm guilty. Everybody has their obsessions, and some have the mean
      • Analog technology is expensive. With CDs, there's really no point to mounting the player on the same sort of isolation platform that's normally used for electron microscopes. But with vinyl, it improves the signal to noise ratio. Of course, the increased signal might still be masked by other factors.
  • by the eric conspiracy ( 20178 ) on Thursday June 15, 2006 @07:22PM (#15544929)
    If it is like other Samsung video players the attention to details like black level etc. won't be that great. I'd wait on this one unless you have money to burn.

  • Blu-ray


    Cowboy Neal.
  • 1080p? (Score:4, Interesting)

    by sam1am ( 753369 ) on Thursday June 15, 2006 @07:27PM (#15544968)
    Anyone know if that is 1080p/60 or just 1080p/24? Didn't see this specified on Samsung's website or in the user manual.
    • Re:1080p? (Score:5, Informative)

      by Anonymous Coward on Thursday June 15, 2006 @11:18PM (#15546174)
      Capable of 1080p60.

      Of course... There isn't any 1080p60 CONTENT. And there isn't going to be, except maybe technical demos, for quite a while. Nearly all films are shot at 24p and any decent HDTV will inverse-telecine back to 1080p24 from 1080i30 telecined frames. And any video-source material out there will be shot in either 720p or 1080i, so really, the 1080p60 is just an additional tick-box on the feature list at this point.
      Heck, force 24 fps film to be output at 1080p60, and it could look WORSE than telecined 1080i30 because of cadence problems.
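The cadence problem comes straight from the arithmetic of 24 versus 60: they share no clean ratio, so frames must be repeated unevenly. A toy Python sketch (illustrative only, not any player's actual logic):

```python
# Why forcing 24 fps film out at 60 Hz needs a 3:2 cadence: 60/24 = 2.5,
# so frames are repeated unevenly (3 copies, then 2, then 3, ...).
# That uneven hold time is the "cadence judder"; at 1080p24 every frame
# would be held for exactly the same duration.

def pulldown_32(frames):
    """Repeat film frames in a 3:2 pattern: 24 in -> 60 out."""
    out = []
    for i, f in enumerate(frames):
        out.extend([f] * (3 if i % 2 == 0 else 2))
    return out

film = list(range(24))          # one second of 24 fps film
video = pulldown_32(film)
assert len(video) == 60         # one second of 60 Hz video
# Frame 0 is shown 3 times, frame 1 only 2 times, etc.: uneven motion.
assert video.count(0) == 3 and video.count(1) == 2
```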
      • All true. Wish I could mod you up, but clearly I've posted. Doh. Thanks for the info.

        Anyways, while there is indeed no 1080p60 content now (well, none distributed), I think HDTV production may settle on it at some point in the next few years. So if TV show releases were available on disc in 1080p60, it could be useful.

        But mostly, was just curious. Was surprised to not see this specified anywhere.
  • by plasmacutter ( 901737 ) on Thursday June 15, 2006 @07:31PM (#15544986)
    Let's do the economics:

    $999.99 for the player
    $40? for the discs

    only a few titles
    LOTS of DRM infesting it and keeping it from playing at full res

    Or I can just:

    take the PC I already have,
    open up a browser to TPB or Tspy,
    search "HR-HDTV",
    torrent the DL,
    and watch full-res HDTV-quality encodes for $0-$25 (have to have DVD-Rs, right?).
    And as a bonus, the last 720p movie I saw in XviD took up 3 gigs... you don't need Blu-Ray or HD-DVD.

    Thanks, Hollywood, for drawing out the R&D and forcing the added costs of tons and tons of DRM! Yet another reason to engage in piracy!
    • by Jherek Carnelian ( 831679 ) on Thursday June 15, 2006 @08:26PM (#15545328)
      HR-HDTV - full res HDTV quality encodes

      If you think the HR xvids are equivalent to full res HDTV, you are missing out.
      They are only 960x540 and the bitrate is nowhere near enough to prevent artifacts like macroblocking and mosquito noise.

      Don't get me wrong: the HR encodes are better than most any analog TV signal, but it is rare that they are better than a good DVD, much less the equal of HD.
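To put numbers on the resolution gap, the pixel counts work out as follows (simple arithmetic, nothing encode-specific):

```python
# A 960x540 "HR" encode carries exactly one quarter of the pixels of a
# full 1920x1080 HD frame, though still 1.5x the pixels of a 720x480
# DVD frame -- and that's before bitrate even enters the picture.
hr  = 960 * 540      # HR xvid frame
hd  = 1920 * 1080    # full HD frame
dvd = 720 * 480      # NTSC DVD frame

assert hd == 4 * hr       # HD has 4x the pixels of an HR encode
assert hr == 1.5 * dvd    # HR has 1.5x the pixels of DVD
```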
    • Most of the Blu-Ray discs on Amazon are $20 currently (search for "Blu-Ray").

      It's a good point that you can simply BitTorrent a lot of HD content now (esp. TV), but Blu-Ray discs will probably look a good deal better and be easier to get. A really large HD torrent can take quite some time to acquire.
  • hmmm . . . (Score:5, Interesting)

    by Maradine ( 194191 ) * on Thursday June 15, 2006 @07:33PM (#15545000) Homepage
    It also up-converts conventional DVDs to 1080p to improve video quality and comes with HDMI, Component, S-video, and composite outputs.

    You know, I've always wondered about this, so someone help me out here. Let's say I have a 1080p HDTV. As it's a discrete pixel device, not a CRT, it's got one native resolution, right? And when I plug my 480i/p DVD player into it to watch a movie, the TV is upsampling the signal to use all of the pixels on the display, right? So why is this a feature on the player? How does it improve image quality? Is it using a blingy-er algorithm than the TV would be using? Marketing fluff?
    • Presumably the player does a better job of scaling than your display device. Nothing more. You could get a general video processor and output the native rate of your display.
    • Re:hmmm . . . (Score:3, Informative)

      by jbreckman ( 917963 )
      I'm not sure about this, but I think it's that DVD players do it better than most TVs. I know my 720p TV does an awful upconversion by itself - but with an upconverting dvd player it looks MUCH better.

      I've seen TVs that do it well though - it is just that some don't.

      Unless you aren't happy with how it looks right now, it's probably not worth the investment.
    • It probably IS using a "bling-ier" scaler algorithm than the TV. Blu-Ray deals with HD resolutions and [optionally] more advanced codecs when compared to DVD. MPEG4 at HD resolution takes a shitpot of processing power compared to a DVD - when playing a DVD, they probably have ample processing power left to do scaling in software. Not that I know if they're doing that, or if they have a hardware scaler in there...
    • Re:hmmm . . . (Score:2, Informative)

      by llZENll ( 545605 )
      Since no one specifies what algorithm they use, the only way to really know is to do some viewing tests. Most likely, since the device supports Blu-Ray, it has a pretty powerful scaler that will look better than your TV's.

      Also, because the image stays digital the whole way, it will be of better quality: it never has to be converted to analog at all.

      For example:

      normal DVD player > convert to analog > analog signal over 480i connection > at TV > convert to digital 1080p > display

      Blu-Ray player
    • Assuming similar-quality scalers, you're better off scaling once, from the original source data. Consider the extreme counter-example: upscaling from 480p to 481p, then with another device from 481p to 482p, ..., then 1079p to 1080p; each scaling step introduces aliasing. Two steps may be small enough that you don't notice, but in theory one step should do better.
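The accumulation argument is easy to demonstrate on a toy 1-D signal. A Python sketch (plain linear interpolation standing in for whatever scaler a real device uses):

```python
# Scaling in one step from the source differs from scaling through an
# intermediate resolution, because every resampling pass smooths the
# signal a little further.

def resample(signal, n):
    """Linearly interpolate a 1-D signal to n samples (n >= 2)."""
    m = len(signal)
    out = []
    for i in range(n):
        x = i * (m - 1) / (n - 1)     # source position for output sample i
        lo = int(x)
        hi = min(lo + 1, m - 1)
        t = x - lo
        out.append(signal[lo] * (1 - t) + signal[hi] * t)
    return out

src = [0.0] * 8 + [1.0] * 8                  # a sharp edge, 16 "pixels" wide
direct  = resample(src, 64)                  # one step: 16 -> 64
chained = resample(resample(src, 32), 64)    # two steps: 16 -> 32 -> 64

diff = max(abs(a - b) for a, b in zip(direct, chained))
assert diff > 0   # the extra pass changes (softens) the result
```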
    • Re:hmmm . . . (Score:5, Informative)

      by interiot ( 50685 ) on Thursday June 15, 2006 @08:20PM (#15545293) Homepage
      The reason that some scalers are better than others is that, once you throw interlaced content into the mix, scaling gets a lot more complicated [dvdfile.com], and is sometimes just an educated guessing game.

      The reason it's in the player is because it's easier to upgrade your player to have a decent scaler than it is to upgrade your TV to get a decent scaler (lots of $$ just for the new TV), or to buy a standalone scaler (standalone scalers aim for the top end of the market).

      Ultimately though, you want a scaler that can work with many different inputs, so that your Dreamcast, DVD player, and video recorder all look good. So having your best scaler be in the DVD player isn't optimal either. Fortunately, scalers in newer TVs are starting to get better (e.g. with names like DCDi showing up more).

    • DVD players that specifically come with upscaling as a feature tend to do a better job than TVs, in my experience. I assume this is because they are working with the original data, upscaling as they decode the MPEG stream, rather than rescaling an already-decoded output signal...
    • Re:hmmm . . . (Score:4, Informative)

      by caudron ( 466327 ) on Thursday June 15, 2006 @10:05PM (#15545800) Homepage
      the TV is upsampling the signal to use all of the pixels on the display, right? So why is this a feature on the player? How does it improve image quality? Is it using a blingy-er algorithm than the TV would be using? Marketing fluff?

      Actually, that is precisely correct. Full-motion image rescaling is a nontrivial task. The TV is rarely (though there are notable exceptions!) the best choice to do the scaling. You want the video to be rescaled before it hits the TV, by something a bit more beefy and slick than what the TV will throw at it.

      The claim here is that the player's scaler is going to be better than the TV's, but probably not as good as a dedicated scaler. I'm sure you can turn the player's scaler off if you want that done by other equipment.

      Is their claim truthful? Who knows? Most likely it is better than the TV's, but I've seen some good TV-based scaling.

      My home theater setup? http://tom.digitalelite.com/caudroplex.html [digitalelite.com]

      Tom Caudron
      http://tom.digitalelite.com/ [digitalelite.com]
    • In theory, the DVD player can do a better job of upsampling because it can grab data from before and after frames too to help with its interpolation. Hypothetically a TV could also do this but it would have to buffer the frames, and it'd be working off of the already-interpolated full frames (containing artifacts) rather than the raw data (well, raw compressed data) that the DVD has available.

      Whether any DVD player actually does this, rather than just do some simple graphic interpolation per frame like th
  • Improve Quality? (Score:3, Insightful)

    by thedbp ( 443047 ) on Thursday June 15, 2006 @07:37PM (#15545024)
    Is that even possible? Just like when you enlarge an image in Photoshop, all you're doing is approximating what pixels WOULD be there... you're not adding any real new information to the image. How could this possibly improve a DVD image?

    This is an honest question. I'd really like to know if they have some special fancy way to truly fill in the gaps of resolution.
    • by Carthag ( 643047 ) on Thursday June 15, 2006 @07:57PM (#15545150) Homepage
      They've licensed the technology from CSI. You even get a voice recognition feature that lets you say "zoom... enhance... enhance"
      • They've licensed the technology from CSI. You even get a voice recognition feature that lets you say "zoom... enhance... enhance"

        The CSI guys do a lot more work than they need to. When they get a blurry photo of the killer, they just need to zoom in on a skin cell to the point that they can see the nucleotides of the perp's DNA.

    • Photoshop has two dimensions to deal with; a video scaler has three. There is a LOT of useful information in how frames change over time that can be used to figure out not just how to sharpen a scaled image, but how to restore detail that was lost.
    • Just like when you enlarge an image in Photoshop, all you're doing to approximating what pixels WOULD be there ... you're not adding any real new information to the image. How could this possibly improve a DVD image?

      True, but there are varying qualities of interpolation. Photoshop's preferences have options for interpolation like "bilinear" and "bicubic" which imply progressively better mathematical methods of sampling the surrounding pixels to interpolate the missing pixels. The more samples and the b
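The difference an interpolator's order makes is easy to see numerically. A Python sketch comparing linear against Catmull-Rom cubic interpolation on a smooth test signal (Catmull-Rom is used here as a simple, well-known cubic; Photoshop's exact "bicubic" kernel may differ):

```python
# Reconstructing a smooth signal from sparse samples: a cubic
# interpolator tracks the original much more closely than linear does,
# because it uses four neighboring samples instead of two.
import math

def linear(ys, x):
    """Linear interpolation between the two samples bracketing x."""
    i = int(x)
    t = x - i
    return ys[i] * (1 - t) + ys[i + 1] * t

def catmull_rom(ys, x):
    """Catmull-Rom cubic interpolation through four neighboring samples."""
    i = int(x)
    t = x - i
    p0, p1, p2, p3 = ys[i - 1], ys[i], ys[i + 1], ys[i + 2]
    return 0.5 * ((2 * p1) + (-p0 + p2) * t
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t * t
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * t * t * t)

samples = [math.sin(x) for x in range(12)]   # sparse samples of sin(x)
xs = [i / 10 for i in range(25, 85)]         # query points in [2.5, 8.5)
err_lin = max(abs(linear(samples, x) - math.sin(x)) for x in xs)
err_cub = max(abs(catmull_rom(samples, x) - math.sin(x)) for x in xs)
assert err_cub < err_lin   # higher-order interpolation, smaller error
```

Neither method adds real information, of course; the cubic just makes a smarter guess about what lies between the samples.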

  • 1080p eh? (Score:3, Interesting)

    by skyman8081 ( 681052 ) <skyman8081@gmaiPERIODl.com minus punct> on Thursday June 15, 2006 @07:37PM (#15545025) Homepage
    Now how about a proper 1080p TV, then? There are HDTVs that have a 1080p display but don't take 1080p inputs, and TVs that take 1080p but downscale it to 720p. Make up your mind!
  • by llZENll ( 545605 ) on Thursday June 15, 2006 @07:47PM (#15545085)
    There are only 5-10 TVs that will even display 1080p right now; not even the Sony SXBRs can accept it (NOTE: many TVs can actually display 1080p but only accept 1080i input, like the SXBR for example).

    As can be seen on this chart [soundandvisionmag.com], 720p will do for most people. The human eye can't resolve the extra detail in the picture from 8' away on a 42" diagonal.
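The chart's claim can be sanity-checked with a little trigonometry, under the common simplification that 20/20 acuity resolves about one arcminute (real vision is more complicated than a single threshold):

```python
# Angular size of one pixel on a 42" 16:9 screen viewed from 8 feet.
# Under the ~1 arcminute acuity assumption, a 1080p pixel falls below
# the resolvable limit at that distance, while a 720p pixel sits just
# above it -- consistent with the parent's "720p will do" claim.
import math

def pixel_angle_deg(diag_in, h_pixels, distance_in):
    width_in = diag_in * 16 / math.hypot(16, 9)   # width of a 16:9 panel
    pitch_in = width_in / h_pixels                # physical size of one pixel
    return math.degrees(math.atan(pitch_in / distance_in))

ARCMIN = 1 / 60                                # acuity limit, in degrees
d1080 = pixel_angle_deg(42, 1920, 8 * 12)      # 8 feet = 96 inches
d720  = pixel_angle_deg(42, 1280, 8 * 12)

assert d1080 < ARCMIN < d720   # 1080p detail is below the acuity limit
```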
  • I wonder if anybody will actually use that. I can't imagine anybody spending that much money on the hardware, plus the extra cost of the discs, just to watch something at the same resolution as my $35 player offers.
    • It's handy if your high end digital screen is in the shop and you need to hook the player up to your old 27" analog temporarily. Especially if you've got a lot invested in HD/Blu discs that won't play on your old DVD player.

      Beyond that, though, you've got a point.
  • So it takes an image or frame that is not 1080i and gimp-fus it up to 1080i, and that improves quality? Or just "scales to 1080i and doesn't suck too hard doing it"? Where do the extra pixels come from?
    • Each HD-DVD and Blu-Ray player comes with magical fairie dust pre-installed (HD-DVD players use a lower grade of dust, explaining the $500 price differential). When the magical fairie dust mingles with the laser beam during playback, it creates a psychic connection to the mind of the director of the movie, which allows the players to then output what s/he actually saw during filming.

      Unfortunately, there are still some problems with boom mikes, crew members and buffet tables being visible in some scenes
      • it creates a psychic connection to the mind of the director of the movie, which allows the players to then output what s/he actually saw during filming.
        Ahh.. that must be Lucas' plan. Once that tech is released, all versions of starwars will look like what lucas thought he filmed all those years ago..
  • From TFA: "Yes, we are double the price of HD-DVD, but we are confident people will buy as many as we can build." Well, if you only build like 50, then that's no problem now, is it? Actually, I'm having doubts that people will buy even 50 of these. Who's going to spend $1000 to watch 10 different titles?
  • by WasterDave ( 20047 ) <davep @ z edkep.com> on Thursday June 15, 2006 @08:14PM (#15545255)
    "Thank god for that, the PS3 is starting to look halfway reasonable again".

  • If there's one thing I've learned from these format wars (Blu-Ray vs HD DVD), it's that "announced" dates don't mean squat. HD DVD was supposed to launch in time for 2005's Christmas shopping season, but was actually released on April 18. The Sony PS3 was supposed to launch in March 2006, but won't be available until November (at the earliest). Sony says Blu-Ray-related technologies are the main reason for PS3's delay, but I'm not sure I believe it (November?).

    At this year's CES, it was announced that Blu

  • >> It also up-converts conventional DVDs to 1080p to improve video quality and comes with HDMI, Component, S-video, and composite outputs.

    Uh.. yeah.. so does my Oppo DVD player that cost about $150.
    • The Oppo (which is regarded as one of the best upscaling DVD players) doesn't play Blu-Ray discs. Although an upscaled image can look very good indeed, a true HD image looks even better. Moreover, even the best DVDs suffer from compression artifacts.

      The Oppo, btw, lists for $199, not $150. I suppose you can get a refurbed item cheaper, but then it's a matter of "who do you trust more, the manufacturer or the refurbisher?"
  • The pixels are not there to begin with, so how is the upconverting making it 1080p?

    Isn't it interpolating the 480p DVD pixels to generate enough pixels to fill 1080p? So is that really 1080p?

    Sure, there are enough pixels to fill 1080p, but since the source was originally 480p, it doesn't sound like true 1080p.

    Probably better to get a Blu-Ray title and have native 1080p on the disc.
    • Say you've got an LCD monitor with a native pixel resolution of 1080x760, but you want a desktop resolution of 640x480. You can do it two ways.

      1. Display the 640x480 image unscaled. This will only fill up the center portion of your monitor and will result in a black frame around the border, for the simple reason that your monitor has more pixel area than what your desktop resolution is set to.

      2. Stretch the 640x480 picture to 1080x760 which will use up the entire screen. Due to the interpolation (approximation), you en
  • Price differential (Score:3, Insightful)

    by Doomstalk ( 629173 ) on Thursday June 15, 2006 @08:25PM (#15545319)
    Yeesh, just look at that price! Unless they can drop its cost rapidly, the Blu-Ray format is almost certain to fail. Even early adopters, who usually see price as no object, are likely to balk at a price like that. A quick search on Froogle finds the Toshiba HD-A1 player for $620, a little less than two-thirds the price. Technologically inferior or not, that's a hell of a price differential to overcome.

    The Playstation 3 is likely to bring prices down, but honestly I think Sony put Blu-Ray tech into the system too soon. DVD was nearing critical mass in 2000, and the Playstation 2's arrival just hammered it home. HD formats, on the other hand, aren't likely to explode for another couple of years, at which point the PS3 will have sunk or swum on its own merits. Having a Blu-Ray drive in the PS3 by default is more likely to be a weight around the system's ankles than a buoy to the top.
  • Composite outputs? (Score:3, Interesting)

    by fbg111 ( 529550 ) on Thursday June 15, 2006 @09:02PM (#15545539)
    Why composite? Seems like a complete waste of money. First, anyone able to afford $999 for a BD player, or whatever reduced but still expensive price this thing will cost until PS3 arrives, will not be watching it on a TV so cheap/antiquated that its best input is composite. Second, I'm not a videophile/home theater buff, but I can clearly remember the vast difference in image quality on my Xbox and PS2 when going from composite to component and composite to s-video, respectively. Seems like composite on a BD player negates the primary selling point of the BD player in the first place - image quality. The only reason I can think of for including composite is that composite is now so cheap that its inclusion has a negligible effect on the unit's manufacturing cost, in which case, why not? Anyone know?
    • Like you said, composite outputs cost next to nothing. And, more importantly, EVERYTHING works with composite. What if you need to hook up your player to a TV that doesn't have HDMI or component inputs? Or even s-video? You'd have to use the composite outputs. Hell, you can even hook up a composite source to an RF input with a cheap adapter. Yeah, it won't look as good, but it will still WORK.

      It's the same kind of good marketing logic that puts a USB-to-PS/2 adapter in the box with almost every mouse and
    Wow... double the price for non-interlaced video! What a compelling feature! AND they toss in dynamically updating DRM for FREE. Sign me up!

    But seriously, screw BluRay. I think Sony is seriously overestimating the influence of the videophile. At least in this country (the US), most people don't know HD from a hole in the ground. It's -so- rare that I actually find someone with an HDTV that's actually using the damn thing properly... even if they have access to content.

    Heck, forget HDTV, just look a widescr
  • Surely all possible early adopters have already ruined their eyesight like the rest of us tech-heads? I can't tell the difference between DivX/XviD and DVD unless someone bypasses my eyes altogether, and I wear weak lenses.
  • I can't wait to plug it into my hdt... oh wait, never mind.
  • Someone will defeat the DRM in the HD format... so that they can rip and re-encode / compress it down to a CD-R sized movie.

    LOL in advance.
