Yes, but there are so many layers that are supposed to smooth over the hardware differences:
- canvas operations are raster-based and lossless
- browser scripts (whether ECMAScript or something else) should provide consistent execution: whatever the underlying hardware, if I ask JavaScript to draw a circle with center (x, y) and radius r, the result should be predictable, not hardware-dependent (see the sketch after this list)
- even granting that browsers use "hardware acceleration" to speed things up, there is still at least one layer between the software and the hardware (an OpenGL driver, or some other driver monstrosity) that *should* produce reproducible, consistent results across different hardware
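
For concreteness, here's roughly what that stack is being asked to do (a minimal sketch for a browser environment; the canvas size and circle parameters are arbitrary):

```javascript
// Draw a circle on a 2D canvas and read the rasterized pixels back.
// In principle the same (x, y, r) input should produce the same pixels
// on every machine; in practice, the antialiasing done by the
// GPU/driver stack is where the differences leak through.
const canvas = document.createElement("canvas");
canvas.width = 100; // arbitrary size for the example
canvas.height = 100;
const ctx = canvas.getContext("2d");

ctx.beginPath();
ctx.arc(50, 50, 40, 0, 2 * Math.PI); // center (50, 50), radius 40
ctx.stroke();

// Raw RGBA bytes of the rendered circle.
const pixels = ctx.getImageData(0, 0, canvas.width, canvas.height).data;
```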
Now, I perfectly understand why neither the browser, nor the OS API, nor the driver bothers to provide perfect results: we're trading accuracy for performance. After all, if I draw my circle with 0.1 pixel of error, it will still look good thanks to antialiasing. But I still think that software results that are independent of external input should not vary from one piece of hardware to another. There is only one correct output for a deterministic software function when it is always given the same input.
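
One way to test that claim empirically is to hash the rasterized bytes and compare digests across machines; digests that differ on identical input are precisely the inconsistency being complained about (and what canvas fingerprinting exploits). A minimal sketch, assuming the `pixels` array from the snippet above and a secure context (SubtleCrypto requires one):

```javascript
// Digest the canvas pixels; truly hardware-independent rendering
// would yield the same SHA-256 hex string on every machine.
async function pixelDigest(pixels) {
  const hash = await crypto.subtle.digest("SHA-256", pixels);
  return Array.from(new Uint8Array(hash))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

pixelDigest(pixels).then(console.log);
```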
Imagine the horror if different processors returned different values when computing 1/0.999 just because they have different hardware (oh wait, this one kinda happened, see the Pentium FDIV bug :D)
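
For what it's worth, floating point is the one place where this did eventually get nailed down: IEEE 754 requires division to be correctly rounded, so any conforming CPU and engine must return bit-identical results for the same operands. A quick way to check from a JS engine:

```javascript
// Dump the raw IEEE 754 bits of 1 / 0.999. Because double division is
// correctly rounded, every conforming engine must print the same hex.
const view = new DataView(new ArrayBuffer(8));
view.setFloat64(0, 1 / 0.999); // stored big-endian by default
let bits = "";
for (let i = 0; i < 8; i++) {
  bits += view.getUint8(i).toString(16).padStart(2, "0");
}
console.log(bits); // identical on every IEEE 754-conforming machine
```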