
Comment: Re:So does scratching your nose (Score 1) 208

by cjonslashdot (#48086925) Attached to: Studies Conclude Hands-Free-calling and Apple Siri Distract Drivers
Yes, tuning the radio is very distracting. In fact, cars that pop up alerts are very dangerous IMO. And the worst designs are those that have modal displays: e.g., when the radio shows either the time or the station and you have to toggle to see one or the other - that takes your eyes off the road. One thing is for sure: dialing a phone is _very_ distracting - as much as texting. I agree with you that talking on the phone is not a great idea in general. I am just not ready to completely eliminate it, because I think that sometimes it is a rational risk if one compensates by being extra careful. But again, people in general do not have the best judgment about these things, so perhaps it should be banned. I personally am really looking forward to driverless cars!

Comment: Re:So does scratching your nose (Score 1) 208

by cjonslashdot (#48084411) Attached to: Studies Conclude Hands-Free-calling and Apple Siri Distract Drivers
I am not like those people who talk on their phone all the time, including when they are picking up their kids. I am on my cellphone rarely, but when I do use it, it is really beneficial, and I am very careful. I tend to agree with you that many people are not so careful and they use it too much behind the wheel. In the morning I see so many people chatting on their phones while driving. I think that since their risky driving puts us all at risk, it might be better to limit cellphone use while driving, but it is a shame, because it penalizes those who are very careful.

Comment: Re:So does scratching your nose (Score 1) 208

by cjonslashdot (#48083569) Attached to: Studies Conclude Hands-Free-calling and Apple Siri Distract Drivers
So that is a judgment - an exercise of intelligence. You are making a judgment that turning the radio knob will not put you in danger. Presumably you do it at a moment when you have several car lengths in front of you. Also, have you ever arrived at a destination and then realized that you don't remember anything about driving there? Perhaps you were lost in thought the whole time...

Comment: Re:So does scratching your nose (Score 1) 208

by cjonslashdot (#48083179) Attached to: Studies Conclude Hands-Free-calling and Apple Siri Distract Drivers
Life is not about eliminating all risk. It is about managing risk in an intelligent manner. Driving is by itself very dangerous, so if we set out to minimize risk, we should not drive at all. My point above was that it is very possible to use a cellphone intelligently and carefully while driving - just as it is possible to listen to the radio and drive safely. I am sure that studies would show that radios cause distraction as well. That is not to say that everyone will use a cellphone safely - on that I certainly agree!

Comment: So does scratching your nose (Score 2) 208

by cjonslashdot (#48082737) Attached to: Studies Conclude Hands-Free-calling and Apple Siri Distract Drivers
And I would rather be a tiny bit distracted, at a safe moment when I have made sure that I have plenty of car lengths in front of me, than be lost, wandering around trying to find my way. The maps application is one of the best driving innovations ever. And Siri is fantastic, in that you don't have to fiddle with an address book on your car's console - you just say, "Call Joe". To me, it _enhances_ safety. And for those who think that I should not talk and drive: remember the times that you were running late and felt the need to rush, whereas by calling someone to say you are a little late, you remove the pressure and can slow down.

Comment: Re:Ain't no body got time for that (Score 1) 606

by cjonslashdot (#46341687) Attached to: 'Google Buses' Are Bad For Cities, Says New York MTA Official

I agree.

I commuted into Washington DC for a year and it was hell. The noise, and just walking on the sidewalk was stressful, with the traffic and congestion and all the drivers in a horrible mood because of it. And when using the metro (subway), I would have to deal with sleet and snow and rain and walking long distances from the metro stop to my destination, avoiding cars and buses and horrible weather, usually with the stress of being on the verge of being late because commuting took such a large chunk of my day.

Today I have a really nice house on a lake in a suburb. I could not have a home like this in a city - it would cost tens of millions of dollars. I can walk to the store if I choose (or kayak there), as well as kayak on the lake for exercise (which I do several times a week), and bicycle on a nearby path with no cars and lots of quiet and beautiful scenery. And nowadays I have a very pleasant 20 minute commute to my job in a suburban office - on the ground floor with windows and my car parked right outside instead of me tucked away up in some high rise prison.

Why anyone would want to live or work in a city mystifies me.

Comment: Re:Because we are stuck in an imperative paradigm (Score 1) 876

by cjonslashdot (#46205149) Attached to: Ask Slashdot: Why Are We Still Writing Text-Based Code?

The rationale for a graphical approach is that one can visualize concurrent behavior more easily.

Structure is also easier to perceive when it is visually apparent - unless it is cluttered with detail-level features, as you have pointed out. So to be able to perceive large-scale structure, one must "factor" low-level features into larger, more conceptual structures.

I agree that software tools on the market for "visual programming" are not what we want here: you are completely right about those tools being suitable only for simple programs. I am talking about the class of models/designs that represent concurrent behavior, including event oriented designs.

For example, consider a system that is designed completely around events. If one can visualize the inter-connection of components, and each component is annotated with the types of events it generates (perhaps using expressions), one can grasp the overall behavior much more easily than if that same design were expressed textually. Thoughts?
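
To make this concrete, here is a minimal sketch in Python (all names invented for illustration) of components annotated with the event types they generate. Because every connection is declared explicitly, the same structure could just as well be rendered as a diagram:

```python
# Sketch of an event-annotated component graph (hypothetical names).
from collections import defaultdict

class Component:
    def __init__(self, name, emits):
        self.name = name          # component label
        self.emits = set(emits)   # event types this component may generate

class Graph:
    def __init__(self):
        self.components = {}
        self.wires = defaultdict(list)  # (source, event type) -> list of sinks

    def add(self, comp):
        self.components[comp.name] = comp

    def connect(self, src, event, dst):
        # only declared event types may be wired up
        assert event in self.components[src].emits, "undeclared event type"
        self.wires[(src, event)].append(dst)

    def emit(self, src, event, payload=None):
        # deliver the event along every explicit wire
        return [(dst, event, payload) for dst in self.wires[(src, event)]]

g = Graph()
g.add(Component("scheduler", emits={"task_completed"}))
g.add(Component("logger", emits=()))
g.connect("scheduler", "task_completed", "logger")
print(g.emit("scheduler", "task_completed", "task A"))
```

The point is not the code itself but that the wiring plus the event-type annotations are the whole design, which is exactly what a graphical view would show at a glance.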

PS - I was on the team that designed the VHDL language at Intermetrics circa 1984, and I built the first synthesis compiler for VHDL, so I definitely appreciate your points. I am just not convinced that the path we are on is the optimal one, rather than simply a result of the dominant computing paradigm (imperative programming)... I have been out of the HDL field for a long time, but I often build simulation models of complex systems, and a visual paradigm seems to really help with analysis. That is what makes me wonder about visual approaches to design. I have also wondered about the fact that software programs - created using an imperative paradigm - are so buggy, whereas electronic systems tend to be far more error free. What are your thoughts on why that is?

Comment: Re:Because we are stuck in an imperative paradigm (Score 1) 876

by cjonslashdot (#46198079) Attached to: Ask Slashdot: Why Are We Still Writing Text-Based Code?

Interesting. Thanks for the clarification.

I can see that low level schematics would be hopelessly complex to wade through, and would be missing "intent". But what about design level?

E.g., suppose the design were based on an event paradigm. In that case, the signals are high-level flows of information, and the events triggering the signals are not at the electrical level, but rather at a conceptual level such as "task A completed".

I wonder if an event-based programming paradigm would lend itself to a more graphical approach.

Comment: Re:Because we are stuck in an imperative paradigm (Score 1) 876

by cjonslashdot (#46195031) Attached to: Ask Slashdot: Why Are We Still Writing Text-Based Code?

Is it really true that ICs are designed with HDLs? I worked in that field during the '80s, so things might have changed; but back then, graphical tools were used, and then the final design was documented using an HDL. I.e., the HDL (e.g., VHDL) was a final generated output, but was not used during the actual design process.

Good point about symbols documenting intent. Symbols enable a comment to refer to something else, e.g., "module ABC".

Comment: Because we are stuck in an imperative paradigm (Score 1) 876

by cjonslashdot (#46192437) Attached to: Ask Slashdot: Why Are We Still Writing Text-Based Code?

It is because there are millions of programmers who are experienced using an imperative programming paradigm, and that keeps it going, because imperative constructs lend themselves to textual form.

It is true that if one were to create graphical equivalents for current programming languages, those graphical languages would be cumbersome. One has to think beyond that:

E.g., an event-based programming paradigm is much more powerful than an imperative paradigm, but event-based programming is hard to understand when expressed textually; it is easy to understand when expressed graphically. And that is why concurrent systems - electronic systems - are designed graphically.

And they tend to be relatively error free, compared to imperatively written programs. Complex chips can be designed with few errors, whereas imperative software code tends to have lots of errors.

A graphical language obviates the need to define symbols. Symbols are only needed to cross-reference things; but in a graphical language, you just connect them. The fact that all communication pathways are explicit means that there is no need to control "aliases", and that makes the design process inherently more reliable, and it lends itself to simulation.
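The "explicit pathways" point can be illustrated with a tiny sketch (Python here, names invented): every connection between blocks is a declared wire object, so there is no hidden aliasing, and the same wiring doubles as a simulation harness:

```python
# Sketch: blocks communicate only through explicit wires (invented names).
class Wire:
    def __init__(self):
        self.value = None

def adder(a, b, out):
    # a combinational block: reads only its input wires, writes only its output wire
    out.value = a.value + b.value

a, b, s = Wire(), Wire(), Wire()
a.value, b.value = 2, 3
adder(a, b, s)      # "simulate" one evaluation of the block
print(s.value)      # 5
```

Because the adder can touch nothing but the wires handed to it, the dataflow is fully visible in the connections - which is precisely what makes such a design easy to draw and to simulate.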

Comment: It doesn't work like that (Score 5, Informative) 365

by cjonslashdot (#45901225) Attached to: Ask Slashdot: How Many (Electronics) Gates Is That Software Algorithm?
It's about more than gates. It is about registers, ALUs, gates, and how they are all connected. There are many different possible architectures, so it depends on the design: some designs are faster but take more real estate. There are algorithm-to-silicon compilers (I know: I wrote one for a product company during the '80s, and it is apparently still in use today), but each compiler assumes a certain architecture. I would recommend one, but I have been out of that field for decades.

