Amazon Betas 'Elastic' Grid Computing Service 78

RebornData writes "I received an e-mail this morning inviting me to participate in a limited beta of Amazon EC2: the Amazon Elastic Compute Cloud. It's a grid computing service that allows you to create and upload your own Linux-based machine images and run them in Amazon's system, starting at $0.10 per "instance hour" (each machine instance being equivalent to a 1.7GHz Xeon with 1.75GB of RAM and 160GB of disk). You can use their tools to create and start new instances dynamically to meet whatever your particular capacity needs are at any given moment. Fedora Core 3 and 4 are explicitly supported, but any distro based on the 2.6 kernel should work. The service documentation provides more technical details. Unfortunately, it appears that the beta is limited to existing Amazon S3 users, and is already full."
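For a sense of what "create and start new instances dynamically" looked like in practice, here is a rough sketch of the workflow using the beta's command-line tools. The command names and flags below (ec2-upload-bundle, ec2-register, ec2-run-instances, ec2-describe-instances) are recalled from the beta tooling and the bucket, manifest, and keypair names are made up for illustration; consult the linked service documentation for the authoritative invocations.

```shell
# Illustrative sketch only -- names and arguments are placeholders, not verbatim.

# 1. Bundle your Linux image locally, then push the bundle into S3
#    (EC2 boots machine images out of your S3 bucket).
ec2-upload-bundle -b my-images -m /tmp/my-fc4-image.manifest.xml \
    -a $AWS_ACCESS_KEY -s $AWS_SECRET_KEY

# 2. Register the uploaded bundle so EC2 knows about it; this returns
#    an AMI identifier such as ami-xxxxxxxx.
ec2-register my-images/my-fc4-image.manifest.xml

# 3. Launch however many instances you currently need -- this is the
#    "elastic" part: run it again whenever load goes up.
ec2-run-instances ami-xxxxxxxx -n 2 -k my-keypair

# 4. Poll for the public DNS names of the running instances.
ec2-describe-instances
```

Scaling down is the same idea in reverse: terminate instances you no longer need, and billing for them stops at the next instance-hour boundary.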
This discussion has been archived. No new comments can be posted.

  • by neonprimetime ( 528653 ) on Thursday August 24, 2006 @01:04PM (#15971090)
    Ok, is it just me, or do the first two links in the post point to the exact same spot?
    Maybe they meant the Technical Documentation [amazonwebservices.com]?
  • by w33t ( 978574 ) on Thursday August 24, 2006 @01:17PM (#15971213) Homepage
    The concept of virtualization is so seductive.

    In our server room we have recently begun virtualizing servers and as a result have begun to think not in terms of physical servers and hard disks anymore, but in terms of resource pools of storage and processing.

    It's like we have been able to smelt our physical machines and from the molten resources forge anew.

    The recoverability and fault-tolerance is amazing as well - if a physical box dies there is basically no interruption in service. If something goes awry with an image we can just pull it and restore from yesterday.

    Seeing Amazon offering what seems to be more of an ocean of resources than a pool is very tantalizing.

    I'm certainly not the first to wonder whether local operating systems and CPUs will become something of an anachronism, and whether most processing will someday occur via the internet: that it will become the world-wide mainframe.
  • Cost sheet (Score:3, Informative)

    by DAldredge ( 2353 ) <SlashdotEmail@GMail.Com> on Thursday August 24, 2006 @01:25PM (#15971294) Journal
    Pricing

            * Pay only for what you use.
            * $0.10 per instance-hour consumed (or part of an hour consumed).
            * $0.20 per GB of data transferred outside of Amazon (i.e., Internet traffic).
            * $0.15 per GB-Month of Amazon S3 storage used for your images (charged by Amazon S3).

    Data transferred within the Amazon EC2 environment, or between Amazon EC2 and Amazon S3, is free of charge (i.e., $0.00 per GB).

    Amazon S3 usage is billed separately from Amazon EC2; charges for each service will be billed at the end of the month.

    (Amazon EC2 is sold by Amazon Web Services LLC.)
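    The rates above are enough to ballpark a monthly bill. A minimal sketch, using the quoted prices with made-up workload numbers (the rates come from the comment above; the instance counts, transfer volume, and image size are illustrative):

    ```python
    import math

    # Rates quoted in the pricing comment above.
    INSTANCE_HOUR = 0.10   # $ per instance-hour (partial hours are billed as whole hours)
    TRANSFER_GB   = 0.20   # $ per GB transferred outside Amazon (Internet traffic)
    S3_GB_MONTH   = 0.15   # $ per GB-month of S3 storage for your images

    def monthly_cost(instances, hours_each, internet_gb, image_gb):
        """Rough monthly bill; EC2<->S3 transfer is free, so it is omitted."""
        compute = instances * math.ceil(hours_each) * INSTANCE_HOUR
        transfer = internet_gb * TRANSFER_GB
        storage = image_gb * S3_GB_MONTH
        return compute + transfer + storage

    # Example: 3 instances running a full 30-day month, 50 GB of Internet
    # traffic out, and 2 GB of machine images stored in S3.
    print(round(monthly_cost(3, 30 * 24, 50, 2), 2))  # -> 226.3
    ```

    At these rates the compute dominates: a single always-on instance is about $72/month before bandwidth and storage.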
  • Use Amazon S3 (Score:3, Informative)

    by RebornData ( 25811 ) on Thursday August 24, 2006 @01:49PM (#15971512)
    This service is paired with the Amazon S3 storage service, which has a high-bandwidth connection to the servers. Data transfer between EC2 and S3 is free.

    -R
  • by funfail ( 970288 ) on Thursday August 24, 2006 @04:54PM (#15973404) Homepage
    They sell the bandwidth separately and it's not cheap.
  • Re:Use Amazon S3 (Score:3, Informative)

    by cduffy ( 652 ) <charles+slashdot@dyfis.net> on Thursday August 24, 2006 @08:47PM (#15974810)
    Upload it as it's generated, so you aren't waiting until just before you run your batch to do the transfer all at once.
