161387
submission
VincenzoRomano writes:
I'm wondering whether the slow penetration of technology in the USA can be explained by anti-science behaviours and beliefs such as creationism.
To cut a long story short, almost all current technology has been conceived, designed and built thanks to science and its scientific method.
If you reject the latter, maybe you are neither willing nor able to use the former. Unless you selectively negate science, which in my opinion makes even less sense.
Are the USA headed towards a technology regression era?
What's the Slashdotters' opinion?
114721
submission
VincenzoRomano writes:
Nowadays we can choose among dozens of active distributions, at least three or four desktop environments, and half a dozen web browsers and office suites.
Every single choice can show its own pitfalls, resulting in incomplete hardware support or even an unstable or unmaintainable installation.
More sadly, almost all distribution teams complain about a lack of resources, both human and economic.
This is clearly due to fragmentation. The same doesn't seem to happen for the *BSD systems, where the choices can be counted on the fingers of one hand.
Wouldn't you promote some kind of consolidation in the distribution arena, for the sake of stability and to make better use of the available resources for a better Linux?
101172
submission
VincenzoRomano writes:
One year ago I decided to buy some "enterprise grade" hardware (firewalls, actually) in order to replace the old ones used by the former ISP.
Before buying, I did a kind of survey: I browsed the product data sheets on the manufacturers' web sites and, in some cases, asked for more details by email.
I finally chose a top product from a very well known and reputable company, one that had already been on the market for a year and a half.
The product showed a number of issues as soon as it was unpacked and put to work: things like not being able to keep a VPN up and running for more than a few minutes, or doing bad IP routing on the LAN.
I've spent the last year trying to make that equipment work according to both its data sheet and the features expected of an "enterprise grade" product.
Important issues are still open, while technical support is actually relying on my own equipment, my setup and my personal availability in order to do troubleshooting, firmware beta testing and other experiments.
I've finally condemned the product as
"far from being ready for market, or even usable by beta testers"
and have requested some kind of compensation for all the work I had to do.
What's your opinion of such behaviour from a company? Is it fair?
87778
submission
VincenzoRomano writes:
It's common practice for manufacturers to put products on the market that are not completely and fully working.
They then rely on users' feedback, and often users' support, in order to debug, troubleshoot and fix the products.
My main focus is on professional and business networking products, but I think this applies to other kinds of products as well.
Sometimes those troubles are quite specific and rare. But more and more often, products are delivered barely working,
or even not working at all, even for the very basic features.
In other words, products can be delivered with little or no QA at all.
I think this is not fair, especially when they ask you to collaborate (for free, of course) in troubleshooting the problems.
I'd like to pay, say, 50% at purchase and 50% when the product is fully working. Or define an hourly wage for the time I spend working for them.
What's your opinion and experience?
Should we ask for some kind of reimbursement?