
Comment: The story behind Apligraf (Score 5, Informative) 157

by az-saguaro (#30099390) Attached to: The Mass Production of Living Tissue

I am not sure why this item was introduced as "moderately disturbing". If you will permit me, I will explain what it is, since I use it regularly. (I have no stock in, nor any other biased affiliation with, the company or product.) The product first came on the market in 1998, over 10 years ago. It has a well-established place in the treatment of chronic wounds. It is not the only product in the category of "living cell therapies" for chronic wounds; the other is Dermagraft, which is similar and has likewise been around for nearly 10 years.

When Apligraf first came out, it was promoted as a skin-graft-in-a-box. It is not. It is allogeneic material (recipients will reject it), and thus it does not "take" to the body like an autogenous skin graft. In its earliest years, when the company was promoting it as a skin graft, it got some high-profile press because it was put to good use as readily available biological coverage for burn victims of the 9-11-2001 twin towers catastrophe. The company that makes it, Organogenesis, partnered with pharma giant Novartis for marketing and product management, and under that arrangement they listened to customers, who told them it was not a skin graft in a box, and redefined its marketing for chronic wounds. Product management has been back in the hands of Organogenesis for about 3 or 4 years now.

The product is essentially a poly-pharmaceutical packaged in a living material. The raw materials come from donated foreskins. Extensive safety testing is done. Pure extracted fibroblasts are put into cell culture, where they do their business and re-form a collagen matrix equivalent to normal dermis. After that, pure keratinocytes are cultured on top of the dermis, and an epidermis forms. The product is shipped in its petri dish, as a circle of 44 sq cm area. The Gizmodo article shows a picture of it. As a living material, its procurement and handling are a bit different from those of most medical devices, but it is easy to get and apply.

The juvenile cells in the material make a broad spectrum of growth factors and other biochemicals which have a positive pro-proliferative effect on wounds. The role for this material is in chronic and pathological wounds. The company got its market approval and indications from the FDA based on studies of diabetic and venous ulcers, but the material is useful for chronic and pathological ("cap") wounds of any cause. Like anything else, it does not work for all wounds or patients, but it is fairly predictable, and its results can be rather dramatic. When a cap wound of whatever cause has been treated to the point that disease is quiet, inflammation is gone, and the wound should be healing but is not, that is when wound stimulatory therapies are applied. There are several available, and Apligraf has been one of the flagship products in this category for 10 years now. Many wounds which simply will not budge no matter what will take off and heal once this is applied.

Organogenesis has its first new product coming out soon, for oral mucosa and gingiva, so perhaps that is why they are trying to stir up some attention with articles like the one quoted. However, it is not Brave New World nor Coma nor any other meat factory. It is just on the leading edge of biological therapeutics in the 21st century. And if Slashdotters want to make lots of jokes as they often do, like "put Viagra in the petri dish to grow more", well, we've already heard them all.

(All very timely, since I just gave a presentation on this last week (and have been giving one for 8 years). If you want to learn more, I posted a copy of the presentation on the website I use for posting talks and presentations and whatnot. This particular talk has a mix of my slides and company slides. It is NOT yet annotated with full text on each slide, so some will just be pictures and you will have to infer what you can, but text should be coming one of these days:
http://www.arimedica.com/content/arimedica_apligraf_(partially%20annotated)_2005-1006.pdf
Again, I have no investment nor bias here, I just use this stuff in practice because it works and it's an important product.)

Comment: MS tender loving customer care. (Score 1) 273

by az-saguaro (#28815535) Attached to: Microsoft Exec Says, "You'll Miss Vista"

"I think people will look back on Vista after the Windows 7 release and realize that there were actually a bunch of good things there . . ."
-
Q: If an MS exec thinks that there is some goodness in Vista that people will miss, then why would they throw away that goodness?
A: They're Microsoft.
-
I have Vista on my laptop. I have XP on my main desktops. I just put XP and Linux on my new netbook. All things considered, I think that the overall Vista experience is a smoldering pile of pig droppings. But amongst all the turds are a few nuggets of digestible goodies. I can certainly see where some users will have gotten used to using Vista, and might miss what they have become accustomed to - that's just normal human nature. I hate it, but I have gotten used to it myself, so I can see the point.
-
If they know there are features that customers like, then why eliminate them from 7? The exec said so himself - people will miss something. I take that to mean that they know there are some things people like, and they are deliberately removing them (cynical view), or they don't give a crap that they are removing them (existential view).
-
His remarks sound like "We know Vista sucks, but wait 'til you see what we did to it in 7 - then you'll finally get how good Vista was - hahahaha!" Or am I missing something?
-
If you want to give your customers a good experience and make them happy, keep the good things in. I have no idea what features he might be referring to, but if he knows that users like them and are going to miss them, why would a corporate exec want to make the new experience worse and leave users nostalgic for the old system? Just wondering. It makes no sense, unless you are in the MS parallel universe, I guess.
-
Like when MS took the best feature of its Office suite, the ability to customize and extend the apps, toolbars, and menus, then stripped it all away in favor of the 2007 Ribbon of Shit. "People might miss the old toolbars . . . screw 'em."

Comment: Get a 100-in-1 for ideas (Score 1) 364

by az-saguaro (#28710569) Attached to: Low-Budget Electronics Projects For High School?

Lots of good suggestions here. Here's what I would do. Get yourself one of those 100-in-1 project kits, or a 400-in-1, work through it, and from there pick a handful of projects that would work with your students. You can then buy the parts you need in bulk - a thousand resistors, a hundred caps, 50 transistors, 100 diodes, etc. - all cheap. Put together a parts box with all that stuff, including wire, solder, bulbs, LEDs, battery clips, alligator leads, etc. Then get a few basic durable goods - soldering irons, a multimeter, etc. You will also need some breadboards and solder boards. Each year, you can add a few projects and components, and soon enough you will have a real electronics workshop.
-
Whatever specific projects you choose, they should represent the basics: basic circuits, basic components, RC networks, basic logic, gates, timing, oscillators and multivibrators, mux-demux, comparators, etc. Five or ten basic building-block ICs should be able to cover hundreds of projects in these basic categories. There is nothing like a good ol' 555 for timing projects, 7400s for starter digital logic, and a 741 for an introduction to analog concepts. Almost any project you will find in a 100-in-1 kit will be easy enough to build for under $5 once you have the basic lab or shop equipment.
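For the 555 timing projects in particular, it helps to have students predict the output before they wire anything. A minimal sketch of the textbook astable-mode estimate - the component values below are purely illustrative, not taken from any specific kit project:

```python
# Classic NE555 astable estimate: frequency and duty cycle from R1, R2, C.
# Component values below are illustrative examples, not from any particular kit.

def ne555_astable(r1_ohms: float, r2_ohms: float, c_farads: float):
    """Return (frequency in Hz, duty cycle) for the standard 555 astable circuit."""
    frequency = 1.44 / ((r1_ohms + 2 * r2_ohms) * c_farads)
    duty_cycle = (r1_ohms + r2_ohms) / (r1_ohms + 2 * r2_ohms)
    return frequency, duty_cycle

if __name__ == "__main__":
    # 10k + 68k with a 1 uF cap: roughly a 10 Hz blinker at ~53% duty cycle.
    f, d = ne555_astable(10_000, 68_000, 1e-6)
    print(f"{f:.1f} Hz, {d:.0%} duty cycle")
```

A class exercise that fits the bulk-parts approach: hand out assorted resistor pairs and have each student predict, then measure, their own LED blink rate.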

Comment: Re:What happens when chloroplasts are removed? (Score 2, Insightful) 168

by az-saguaro (#28588981) Attached to: Unicellular "Enigma" Changes From Predator To Plant and Back

Correct, that is our conventional understanding of things. But what if there are other primitive energy capture and translation systems that remain repressed or down-regulated by the presence of these structures? What if a cell could be kept on "life support" for a few hours or days after removing its mitochondria or chloroplasts, enough for up-regulation of latent genes that would revert the cell back into some sort of bacteria-like mode of metabolism? Granted, it is much less likely for advanced eukaryotes like mammalian or insect cells or rose bushes, but what about for algae or diatoms or sponges? We can presume that at some point endosymbionts and cells became so entangled that neither could survive or revert without the other. However, it would also seem likely that there is a transition group of species which could still be unentangled in the lab.

Comment: Re:What happens when chloroplasts are removed? (Score 4, Interesting) 168

by az-saguaro (#28588055) Attached to: Unicellular "Enigma" Changes From Predator To Plant and Back

Yes, you are exactly correct - sticking chloroplasts into animal cells would be the necessary flip side of that experiment.
-
I was not referring to turning pine trees into Night of the Living Dead. What would be interesting is to see what would happen to algae under these circumstances, or to cultures of moss cells or flowering plant cells. Pick a popular research plant - tobacco for instance - and then pull the chloroplasts out of a few cells, then stick them into a cell culture medium - e.g. agar petri dishes or mammalian cell culture flasks - and see if they become planktonic, aggressive, nutrient-tropic, or if they start to express cell surface structures or other organelles related to sensing and locomotion. Since the algae are phylogenetically much closer to all of this, it seems plausible that they might revert to animal-like forms and function.
-
If nobody has ever done these experiments, now would be a good time.

Comment: What happens when chloroplasts are removed? (Score 5, Interesting) 168

by az-saguaro (#28587583) Attached to: Unicellular "Enigma" Changes From Predator To Plant and Back

Biology is full of promoter-inhibitor relationships, and this seems like an interesting one. When the alga is inside the protist, the host's "animal" behaviors and anatomy are suppressed, but they clearly remain in a latent state, ready to reactivate after fission. It makes one wonder to what extent chloroplasts remain as endosymbionts versus organelles in genuine plant species. So . . .
. . .
Does anyone know of any research where chloroplasts were removed from plant cells in culture, to see if the remaining cells revert to some atavistic animal-like exogenous-food-seeking state?

Data Storage

+ - Billion year "hard drive" from carbon nanotubes

Submitted by az-saguaro
az-saguaro (1231754) writes "The journal Science reports on a data storage technology that could be stable essentially forever. It is based on nanometer-scale iron crystals slipped into carbon nanotubes. With one crystal per tube, a bit is stored as a 1 or 0, depending on which end of the tube the crystal sits at. "Which end" is evidently an electrically writable property, and the accompanying video shows it in action. Because of the stability of carbon nanotubes, this technology could theoretically be one of the most durable archival media ever."

Comment: Missing the mark on evolving technology & market (Score 1) 435

by az-saguaro (#27830875) Attached to: First Look At Windows 7 On an Entry-Level Netbook

This article and related comments make me think that netbooks will be a problem for Windows in ways that were not fully anticipated. There are two premises to this:

1) Where is the MinWin concept in all of this? It sounds like the concept of a small, snappy kernel is only relative to where Windows was with Vista. If it just barely runs on an Atom-based system, with no capacity or wiggle room for big apps and data, then how could this ever be ported to ARM and various mobile devices like phones, PDAs, etc.?

2) Netbooks are about to become the new laptops. The current laptop form factor has been around for about 15 years, successful only after technology worked its way through various large incarnations of "portable PCs" and performance became commensurate with desktops. Now we are entering another era of miniaturization, and netbooks, at about half the size of laptops, seem to be a nice balance of smaller size with a still-usable screen and keyboard. And don't forget that iPhones and the like are a winning form factor as well.

The problem is that people are not going to settle for netbooks being just a glorified PDA or internet kiosk. They will see a familiar user interface, it will run most of their apps (even if slow and kludgy), and it will look and feel mostly like their desktops and laptops. So, expectations and demands will rise. The industry will respond by making chips, boards, screen technologies, etc. more capable, and within 5 to 10 years, smaller form factor netbooks will rival the performance of today's laptops and even today's desktops, just as 15 years ago "laptops" supplanted "portable PCs". The current form factor of laptops will not disappear - laptops are successful for a reason, and large screens and keyboards will remain of crucial importance for many users - but people will come to expect the smaller form netbook, handy and easy to carry, to nonetheless perform as a desktop and run all their apps while on the go.

Hardware makers will make these goals possible. In the meantime, people are starting to become familiar with other OSes as the cellphone-PDA-mediaplayer class of devices becomes more pervasive. As such, what people really want is easy, smooth, intuitive, bug-free, transparent performance, and not necessarily a single proprietary look and feel. This article and thread make me wonder if MS and Windows are going to end up being perpetually a step behind, planning products based on today's marketplace and technologies but delivering the product several years later, when technology and people's expectations have moved forward.

Comment: "Dot com" just did not compute for them. (Score 3, Informative) 141

by az-saguaro (#27767767) Attached to: Time Warner To Spin Off AOL

You are completely right, but they never would have changed their name to "TimeWarner.com" or something idiotic like that. I don't think "dot com" really meant anything to them; they simply didn't understand how the world was changing. They were stuck on the AOL way of doing things, which was most definitely NOT "dot com".

Part of that whole mess was just raw psychology: hubris, blindness, old fogeyism, and getting run over by the bullet train of market reality. In the period circa 1998 - 2001, Win 98, Internet Explorer, DSL, cable broadband, and the dotcom boom all turned the world en masse to the real Internet. While AOL saw opportunities in the Internet, it was so tied to its own version of online services, a glorified dialup bulletin board service, that it never saw where the rest of the world had suddenly detoured to. The AOL - Time Warner merger came after the ascendancy of AOL, just as it was starting to become irrelevant.

Hubris - thinking that flash-in-the-pan AOL could take the leadership role from well-established and dependable multimedia Time-Warner. Blindness - letting their hubris and rose-colored vision mask what was happening with the real Internet and ISPs. Old fogeyism - believing that their traditional ways would prevail, as the whole world was giving up roller skates for sports cars.

Not surprising in retrospect, though perhaps not predictable at the time, is that consumer tastes in the media itself changed. Time Magazine and Turner Classic Movies remain important, but who at the time realized that the likes of YouTube, Facebook, the blogosphere, and all of their forebears would divert attention from classic print and TV media?

At that time, they just didn't get it, what "dot com" was really all about. They were all going to lose value anyway, but kudos to Steve Case for sucking something out of the stodgy and clueless old guard - like it or hate it, you gotta admire it.

Comment: Analog modeling would be better for disease states (Score 1) 521

by az-saguaro (#27334819) Attached to: Microchip Mimics a Brain With 200,000 Neurons

Your point is well taken. A project like this is trying to mimic normal neuroanatomy and physiology. What happens when the brain becomes acutely or chronically pathological is much more complex. When the nervous system is in its usual healthy state, then the kind of digital architecture implemented in this project can work to a degree as a simulator. For instance, when a synapse triggers, it is in a sense an all-or-none saturated response or switch, so digital logic is useful to model it. However, neurons and their interconnections are more analog than digital, especially during conditions of disease. For instance, changes in the levels of a neurotransmitter or its receptor, or electrolyte changes which will attenuate the response to the transmitter, create variable responses, more like putting a potentiometer in the circuit, rather than an on-off switch. In fact, the responses of a neuron to its many synaptic inputs will exhibit varying degrees of time base integration, multiple input summation, and selective signal rejection, while on the output side, there can be amplification or attenuation, and oscillations or chaotic dynamics in lieu of one-shot trigger responses. This is the perfect place for large scale analog circuit modeling.

It seems like digital chips and digital programming, simulation, and control have become so entrenched (or cheap) in the mindset of designers and users that analog gets short-changed when thinking about modern large-scale models. Designing a VLSIC with thousands of opamps, making them addressable through thousands of DACs, and then ganging thousands of them together into your "brain computer simulator" would be a daunting and expensive chip design challenge, but ultimately far more realistic. There are some things, like the pathological states that you mentioned, that just cannot be effectively simulated on a digital chip - in a software simulation yes, but not in digital-only silicon, or so it seems to me.
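Even in a software simulation, the difference is easy to illustrate. The toy model below (my sketch only, not the architecture of the chip in the article) treats each synapse as a variable attenuator, like a potentiometer, feeding a leaky integrator; scaling a weight, as reduced transmitter or receptor levels would, shifts the firing rate continuously instead of flipping a switch:

```python
# Toy leaky-integrator neuron with continuously variable synaptic gains.
# Purely illustrative; this is not the architecture of the chip in the article.

def simulate(weights, spike_trains, dt=1e-3, tau=0.02, threshold=1.0):
    """Integrate weighted inputs with a leak; return output spike times in seconds."""
    v, out_spikes = 0.0, []
    for step, inputs in enumerate(spike_trains):
        # The weights act like potentiometers: scaling one attenuates or amplifies
        # that input continuously, rather than switching it on or off.
        drive = sum(w * x for w, x in zip(weights, inputs))
        v += dt * (-v / tau + drive)      # leaky time-base integration of summed inputs
        if v >= threshold:                # all-or-none output event (the "digital" part)
            out_spikes.append(step * dt)
            v = 0.0                       # reset after firing
    return out_spikes

# Halving one weight (think: less transmitter, or fewer receptors) smoothly lowers
# the firing rate instead of turning the neuron off.
train = [(1.0, 1.0)] * 1000               # steady drive on two synapses, 1 s at 1 ms steps
print(len(simulate([60.0, 60.0], train)), "spikes vs", len(simulate([30.0, 60.0], train)), "spikes")
```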

Comment: It's all about to change. (Score 1) 1316

by az-saguaro (#27203475) Attached to: Narcissistic College Graduates In the Workplace?

The original post may be based on incidental observations, meaning it is true for some people, but I don't think that a generalization can be drawn. In my own work, I see some young grads who are lazy, some who are full-bore energetic self-starters with realistic understanding and expectations, and everything in between.

"TV reality college graduates" has some truth behind it, because TV can glamorize your appreciation of things, or else over-romanticize them into unreality. For instance, who would have thought that being a chef would be glamorous, but the Food Network has propelled chefdom into stardom and cooking schools into sought after trades - that's glamour. But how many people have romanticized about being a TV or sports celebs thinking that they too can be a star, only to have reality bite back when you try to get your big break.

IT and tech jobs have been glamorized to some extent on TV, and so have business-MBA-entrepreneur models, so no surprise then that someone who is fundamentally immature and unrealistic might seek to be a "22-year-old who leads billion-dollar corporate mergers in Paris and jets around the world". I think three things are about to change those attitudes:

1 - With the world economy in a meltdown, and major banks and financiers dissolved, being a money-grubbing MBAstard is about to get less glamorous. Anybody looking to be that 22-year-old billionaire will get some serious ego deflation in the coming job market.

2 - If you want to work from the tech rather than the business side, well, guess what, lots of companies cannot afford tech right now, so, no soup for you! Just like the wannabe starlets who wait tables, you might only find employment these days doing the same, or accepting get-dirty infrastructure work that the government is now starting to fund. Nothing like digging a ditch to burst your bubble.

3 - All new industries have a certain dynamism that is inherently glamorous. The PC revolution made lots of MS millionaires. The dot-com era made lots of dot-com millionaires. The net-Google era is making its own millionaires. But these major emerging technologies and societal transformations are already here, and their entrepreneurial heyday might be over, or at least in a lull, until some other new major tech and industry comes along, such as perhaps green energy. With the current worldwide economic slump, glitzy make-millions jobs for young grads riding the wave of a new industry just aren't there right now, at least not in computers and tech, as far as I can see.

Comment: Hard drive cloning - easy, safe, secure (Score 2, Interesting) 189

by az-saguaro (#27177417) Attached to: Windows Security and On-line Training Courses?

This thread has generated a lot of great responses, and you can pick and choose from a variety of good solutions. Here is another, the one that I have settled on as my preferred safety-backup-reinstall method: hard drive clones.

I use XP-SP2. My main machine has been running smooth as silk for 4 years. I have had rare problems, but when they have occurred, they have been of mixed causes - hard drive failure, a UPS failure which caused unbootable file system corruption, and even a trojan picked up right here on a Slashdot link a few months ago. No sweat for me though . . .

My backup solution depends on external hard drives which mirror my internal drives. I keep all data and apps (other than those that insist on installing under \Program Files) on separate internal drives. That way, if C: gets corrupted, my other data is safe. My C: system drive has only the OS and Program Files apps. This means that I can keep the system drive relatively small (120GB), meaning I can buy several mirror drives quite inexpensively.

I have several C: drive mirrors. I duplicate my main drive to these external backups 2 or 3 times a week. I duplicate just before any major system or application upgrade. I use an older version of Norton Ghost (v9) for this, which makes flawless duplicates while running in the background. (I also use Acronis to make point-in-time compressed images of the drive, which can be reloaded onto a hard drive if need be.)

The few times that I have had a disaster, I just pull out my latest mirror, swap it into the disk-0 position, and turn my machine back on - like nothing ever happened.

Consequently, this is also a great way to test installations or new software, or to create drives that you or your wife could use for your own purposes.

(See the comments above by diggitzz about cleaning up your dirty system before getting ready to make your first mirror image.)

Ever since settling on the system-drive-mirror solution for my OS safety backups, I have not had a moment's anxiety about losing a drive, testing new OSes, nor keeping my installation clean.
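If you would rather not rely on commercial tools, the same raw-mirror idea can be sketched in a few lines. This is an illustration of the concept only (hypothetical device names, and it must be run from a separate live/recovery environment with both drives unmounted); it is not how Ghost or Acronis actually work:

```python
# Minimal sketch of a raw, sector-for-sector mirror of one drive onto another.
# Device paths are hypothetical examples. Run only from a live/recovery OS with
# BOTH drives unmounted, and triple-check the target: it will be overwritten.
# This illustrates the mirror concept only; it is not how Ghost or Acronis work.

def clone_device(source: str, target: str, chunk_bytes: int = 4 * 1024 * 1024) -> int:
    """Copy every byte of `source` onto `target`; return the number of bytes copied."""
    copied = 0
    with open(source, "rb") as src, open(target, "r+b") as dst:
        while True:
            chunk = src.read(chunk_bytes)   # read the source drive in 4 MB chunks
            if not chunk:
                break
            dst.write(chunk)                # write the same bytes at the same offset
            copied += len(chunk)
    return copied

if __name__ == "__main__":
    total = clone_device("/dev/sdX", "/dev/sdY")   # hypothetical source and mirror drives
    print(f"Mirrored {total / 1e9:.1f} GB")
```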

Comment: Problem cured with Scotch tape and the splicer (Score 1) 160

by az-saguaro (#27085807) Attached to: A History of Storage, From Punch Cards To Blu-ray

Back in high school c1970, we got the coolest toy - a rack-mount high-speed paper tape reader to feed our PDP-8/S. We could load our 4K Fortran and still have half the memory left over for programming. Evidently our school had a bigger budget than where you were, because we also got the fanfold tape splice & repair gizmo.

Comment: Gimme those MO geek cred points (Score 1) 160

by az-saguaro (#27085443) Attached to: A History of Storage, From Punch Cards To Blu-ray

From the original article: "Magneto-Optical Drive . . . If you've ever owned one of these drives, award yourself 100 geek-cred points, and 1000 points if you still own one."

Gimme a couple o'thousand of them geek cred points!

I had three Fujitsu MO drives, on line from about 1997 to 2002. I used them for hard drive backup and off-line storage. The reason was simple - best cost-per-megabyte of all media during that period, plus luxurious high capacity by standards of that era.

The rules are simple: every advance in processor speed, memory, hard drive capacity, screen resolution, app complexity, file formats, I/O interfaces, and I/O devices results in users generating bigger and bigger files, more and more data. Hard drives are always at the head of the curve on sheer capacity, but until the past few years, hard drives were also expensive. How then do you find the best balance between economy of hardware costs, economy of media costs and storage space, and economy of time to write files and back up? In that era, the late 90's, hard drives were roughly in the 500MB - 4GB range, and pricey. So how do you back up? I used CDs, DVDs, Colorado/QIC tapes, and then MOs.

CDs seemed great c1992-1994, but write speeds then were sloooow, their capacities quickly became inadequate, and they were pricey until DVDs came along.

DVDs were just like CDs, just a generation farther in terms of speed and capacity, but with the same caveats and shortcomings. Whether CD or DVD, these media were simply behind the curve compared to HD capacity and speed.

(Blu-ray simply extends these same issues to another generation. Optical is useful as a medium for large file exchange and distribution, but until someone comes up with a 500GB or 2TB 5.25-inch optical disk for $5 that can do a full write in 30 minutes, it is mainly useless for most backup tasks.)

QIC tapes were great for total system backup and restore, but only for relatively small HDs, and they were slow for random data access, the tapes were pricey, and the technology was being phased out by 2002.

MO fit the bill for robust, high-capacity, affordable storage. It was the genuine diamond-in-the-rough. I could not find my little file of calculations from back then, but the drives were affordable (comparable to any CD recorder), the media were a fraction of the cost-per-megabyte of writable CDs (and then DVDs), read/write access times were way better than CD, and the media were sturdy and well protected. Capacities were 640MB, like CDs, but reusable, faster, cheaper. It was all good.

I never understood back then why it wasn't more popular, because it was superior in almost all respects of usability and expense. Technologically, I suppose there will be experts here who can comment on that, but it seemed like a fairly dependable technology, which lives on today as DVD-RAM (also dying). I suppose that the companies who made it just never organized the way that Blu-ray or HD-DVD consortia organized and pushed for their formats.

I have long since copied all of my MO disks to more contemporary storage, but my Fujitsu SCSI-interfaced DynaMO drives are still here on my shelf, ready to power up anytime I want to plug in - and yes, they work just great.

All of these discussions became moot when GMR (giant magnetoresistance) was discovered, suddenly pushing HD capacities from 4GB to 18GB, and then on to today's TB capacities. Today, there is no pragmatic way for the home / office / small business user to back up PCs with a few TBs of data except by using other HDs. And since HD prices have had a steep decline, the cost-per-GB is dirt cheap these days. My own backup strategies for the past few years have been exclusively HD-to-HD, with MULTIPLE backup sets at all times that are cheap, fast, and run in the background. HD-to-HD backup is far superior to any CD-DVD-QIC-MO-BluRay solution - heartily recommended to all users - and that advice earns me another couple o'thousand of them geek cred points!
