Comment Re:specification & testing (Score 1) 52

You can do software engineering in any language, even C or Visual Basic. But it's easier with languages that provide more precision in specification and in behavior. Once you get used to it, strong typing, particularly for scalars (integer types), is your friend: it can catch at compile time subtle bugs that can be hard to find. Similarly, languages that make it easy to define chunks of code with well-defined interfaces/APIs (and that do information hiding on the details) are a huge advantage in large (as in 'many developers') and long-lived (as in 'many maintainers') systems. In large systems, code is read -much more frequently- than it is written. A good API specification should help the user/reader understand what happens when things go right, and what happens when things go wrong. Even in Ada, there are a few things the language specification leaves 'undefined', which means "we have no clue what a program that makes this mistake will do."
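A minimal sketch of the "strong typing for scalars" point, in Python rather than Ada (Ada's derived types reject this at compile time; Python with a static checker like mypy approximates that, and this runtime version just makes the rejection visible). The unit types here are purely illustrative:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Meters:
    """A distinct scalar type: mixing it with other units is an error."""
    value: float

    def __add__(self, other: "Meters") -> "Meters":
        if not isinstance(other, Meters):
            raise TypeError("cannot add Meters to a non-Meters quantity")
        return Meters(self.value + other.value)

@dataclass(frozen=True)
class Feet:
    value: float

altitude = Meters(100.0) + Meters(25.0)   # fine: same unit type
try:
    Meters(100.0) + Feet(25.0)            # rejected, not silently wrong
except TypeError as err:
    print("rejected:", err)
```

With plain floats both additions "work", and the unit-mix bug ships; with distinct types the second one can't even get past review (or, in Ada, past the compiler).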

Comment Re: specification & testing (Score 1) 52

I've been doing software engineering for over 20 years now and, just like unicorns, I have yet to see a specification document that does more than scratch the surface.

Then clearly you've not worked in a safety-critical regime, where specifications are required to fully cover behavior sufficient to generate test cases that cover 100% of the requirements, and by extension 100% of the code branches.
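As a rough illustration of that requirements-to-test traceability (identifiers here are made up; a real safety-critical process, e.g. under a DO-178C-style regime, maintains this as a traceability matrix), a check like this fails the build if any requirement lacks a covering test case:

```python
# Hypothetical requirement IDs and test cases, for illustration only.
requirements = {"REQ-001", "REQ-002", "REQ-003"}

test_cases = {
    "TC-01": {"REQ-001"},             # each test lists the requirements
    "TC-02": {"REQ-002", "REQ-003"},  # it verifies
}

# Union of everything the test suite claims to cover.
covered = set().union(*test_cases.values())
uncovered = requirements - covered

assert not uncovered, f"requirements without tests: {sorted(uncovered)}"
print("requirements coverage: 100%")
```

The point isn't the ten lines of Python; it's that the specification has to be precise enough that a mapping like this can exist at all.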

But even in "regular" development, the API specifications I wrote, implemented and tested were all expected to be sufficient to be useful for test and verification.

Frankly, if you haven't seen this level of specification, I really question if you've been doing "software engineering", or just hacking.

Comment "influencers" citing other "influencers" (Score 1) 79

"Sources familiar with the situation" make shit up all the time. Apple production numbers and expectations are tightly held, but people still believe "media influencers" who look around and say "Huh, we can collect clicks by generating some FUD around Apple." In this case, what we see is one "media influencer" citing another "media influencer" using their "proprietary algorithms" (usually involving rectal sampling).

When you're in a yearly production cycle, there are ALWAYS production cuts after the product release: you ramp production leading up to release, and then move production to The Next Thing.

Comment Re:Also, don't trust human Reddit answers (Score 1) 70

Uh, yeah. That's exactly what I thought when I saw this.

All Social Media is full of 'influencers' who don't know what the fuck they're talking about. The potential benefit of a (wrong) AI answer is that you -might- be able to query the AI to find out why it said that. Of course, the LLM is likely to say "That's the most common content from Social Media."

Comment Language Popularity studies were always bogus (Score 1) 51

What exactly do they measure? SLOC produced? Job Postings? GitHub commits? Articles in magazines? Job openings that mention the desired language(s)?

(For the latter, I remember a company using job postings in the LA Times, their local paper, as their "independent scientific" rationale for selecting their preferred programming language. I responded, "By that measure, shouldn't you be developing this system in HTML, since that's listed as a programming language in the job openings you surveyed?" Then I said, "Look, if you want to pick a specific language, just do it. Don't -pretend- you did an independent language selection process.")

Comment "There is nothing wrong with your browser" (Score 1) 80

There is nothing wrong with your television. Do not attempt to adjust the picture. We are now controlling the transmission. We control the horizontal and the vertical. We can deluge you with a thousand channels or expand one single image to crystal clarity and beyond. We can shape your vision to anything our imagination can conceive. For the next hour we will control all that you see and hear. You are about to experience the awe and mystery which reaches from the deepest inner mind to the outer limits.

https://www.quotes.net/mquote/...

Comment Faster HW sometimes fixes SW problems (Score 1) 62

As others have pointed out, embedded software is hard, and real-world timing problems are unforgiving. One possibility that occurred to me is that the problems may be related to timing managed by the operating system. If they're using a typical cyclic executive with timeframes allocated to each task, and some tasks can't be completed within their allotted timeframes, then a faster computer would fix those problems. Comms protocols, in particular, often have timing constraints where if you can't complete the task in time, the protocol times out and the operation fails. Doug Jensen defined 'hard real-time' as a situation where the value of a computation after a specific time is zero. That includes the situation where the computation cannot be completed within the given time window.
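A toy sketch of that failure mode, with made-up task names and time budgets: one minor frame of a cyclic executive, where a task that overruns its slot gets flagged, since under Jensen's definition a late result is worth nothing:

```python
import time

# Illustrative budgets: seconds allotted to each task in one minor frame.
FRAME_BUDGETS = {
    "sensor_read": 0.010,
    "comms_poll":  0.005,
}

def run_frame(tasks):
    """Run each task once, flagging any that exceed its time budget."""
    overruns = []
    for name, fn in tasks.items():
        start = time.monotonic()
        fn()
        elapsed = time.monotonic() - start
        if elapsed > FRAME_BUDGETS[name]:
            overruns.append(name)   # late result: value is zero
    return overruns

def fast_task():
    pass                            # finishes well inside its budget

def slow_task():
    time.sleep(0.02)                # deliberately blows its 5 ms budget

overruns = run_frame({"sensor_read": fast_task, "comms_poll": slow_task})
print("overran:", overruns)
```

On a faster box, slow_task's real-world equivalent fits back inside its slot and the overrun list comes back empty, which is exactly how new hardware can "fix" a software problem.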

Sure, you could argue that's a repair for under-specified, under-designed, and under-tested software, and I would agree.

Comment Re:This is one symptom of a deeper problem (Score 1) 243

I disagree. This is much more a consequence of "Moore's Law" applied more generally. USB-C has much higher bandwidth and adds significant power distribution, at the cost of new connectors. Thunderbolt 4, using the same connectors, has even more bandwidth. But at some point, we could well see 'copper' replaced with 'fiber' or 'composite' cables, with fiber for data and copper for power.

That being said, it looks like USB-C and HDMI as form factors/connectors will provide a reasonable lifetime for connections.

Comment Re:History (Score 2) 39

Capability machines and tagged architectures have an interesting history. The first I know of is the Burroughs B6500/B6700 line from the mid to late '60s: https://en.wikipedia.org/wiki/... Then there's the Intel 432 https://en.wikipedia.org/wiki/... and the short-lived BiiN system that was a successor to the 432: https://en.wikipedia.org/wiki/... But a lot of this dates back (like so much of computing) to Multics and its mandatory access control mechanisms.
