Comment Re:Only Apple (Score 1) 624

It's not exactly great, but it's not all THAT bad. After all, it comes with a webcam and USB ports. Either one of those features alone adds a huge boost in use value (I'd love to videochat with someone while chilling on my couch, then bring the tablet over to the kitchen and prop it up while I do dishes and keep talking! And the value of even a single USB port is pretty much self-evident!). Having both of these features at a lower price than an iPad is not something you can honestly ignore or dismiss.

Basically, it's two different products from totally different origins entering the same space: the above-smartphone-yet-under-netbook niche. The iPad and the Eee are each awesome in their own way.

tldr: competition = yay!

Comment Why did it fail? (Score 1) 203

Nice article that made me very curious about one thing: why did the Newton fail? It seems like an amazingly useful and cutting-edge device that should have been snatched up by everybody.

Maybe it was just a little bit TOO new, so it didn't fit well enough into people's existing workflows?

Comment Re:Translation: Massive Union Vote Buying Program (Score 2, Insightful) 801

Culture is both a cause and a consequence. Parents might magically wake up and start talking to their children about the wonders of science. Or they might not. But culture can also be another tool in the government toolkit (e.g. religion).

Your fantasy of an evil controlling nanny state versus the rebellious freedom-fighter parents is just that: a fantasy. Sometimes the government is doing something that should be done, and sometimes not; sometimes parents are doing what should be done, and sometimes not. I think everyone recognizes by now that most people don't spend time getting their kids excited about science, and so the ability to reason and think clearly is declining. Hence, it's forward-thinking for someone with the power and responsibility of the President of the US to invest in science and mathematics education. I don't see how any clear-thinking person could be against this.

Simplified models beget simplified thinking.

Comment Re:Put lesson plans on TurnItIn.com (Score 1) 590

Begging your pardon for a moment, but isn't the point of university education and student teaching precisely to give teachers what they need to do their job, including adhering to lesson plan guidelines from state agencies and national standards? That's essentially how I remember it.

Again, I must reiterate: for-profit education reduces incentive to widely disseminate information. We frequently talk about open source software models being profitable not because of the content but because of the necessary services to implement it in practice. Why not the lesson plans too?

Comment Put lesson plans on TurnItIn.com (Score 1) 590

After all, if a student earns a grade for their own unique academic paper, shouldn't the teacher be required to earn their dollar for their own academic lesson plans or be penalized for it?

Reducing education to a financial transaction either needs to work both ways, or work neither way. If the teacher can buy a lesson plan and tailor it to their classroom, a student should be able to buy a paper and tailor it to their specific need too. It's an absurd example, but one that illustrates that all parties in education need to adapt to each other rather than reduce things to a dollar sign and marginalize society's most important equalizer.
Caldera

SCO Terminates Darl McBride 458

bpechter writes "Linux Today reports that SCO has terminated Darl McBride and links to SCO's 8-K SEC report. The report, which can also be found at the SCO site, states: 'the Company has eliminated the Chief Executive Officer and President positions and consequently terminated Darl McBride.'"

Comment Oops...one critical mistake I should point out... (Score 1) 301

I meant to say in my first paragraph that Verilog has both procedural and concurrent structures, and that its C-like syntax tends to push people toward procedural constructs rather than concurrent ones, which leads to gross compiler assumptions and/or non-synthesizability. To avoid confusion, I would therefore suggest that the strong typing in VHDL makes it easier to understand digital design in the context of an HDL. Sorry about the confusion.

Comment Not in the context of FPGA/HDL synthesis it's not (Score 2, Interesting) 301

You're right that Verilog has those constructs, but they're strictly for modeling. Either the tool won't produce synthesizable code from them, or, if it does handle them, it does so implicitly, and you absolutely have to know what the implications are. Again, HDLs are not programming languages in the get-to-the-chip sense; they're concurrent systems description languages. All the more reason to leave Verilog alone at the outset and learn with VHDL.

Comment Advice from a former instructor of VHDL and FPGAs (Score 5, Informative) 301

It's been about ten years since my TAs and I taught the lab section of the advanced digital logic design course at my university. I agree that, generally speaking, VHDL is a better teaching language than Verilog. Part of the reason is that Verilog, being much like C, is inherently procedural. You don't want to think procedurally about digital logic except in the specific case of state machine design, and even then you have to take concurrency into account. This fundamental concurrency of HDLs is the key to designing effectively. I can define twenty clocks going into counters, just like I can wear twenty watches on my arm and have them all tell time independently and/or at different speeds. You can't really do that in a procedural language without turning it into a thread scheduling exercise, and even then you will never get the speed of digital logic, because instruction fetch, instruction decode, etc. introduce latency that cannot be eliminated even on a multi-core CPU. Not thinking procedurally will help, and the strong typing of VHDL over Verilog will, in my opinion, help greatly. Those Karnaugh maps you mention are fine to learn, but VHDL's case statements make state machine design trivial, especially when you have more than eight states.
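To make the case-statement point concrete, here's a minimal sketch of a three-state Moore machine in VHDL. The entity, port, and state names are all made up for illustration; the shape (a clocked register process, a combinational next-state case, and a concurrent output assignment) is the standard synthesizable pattern:

```vhdl
-- Hypothetical example: a tiny Moore state machine. All names are illustrative.
library ieee;
use ieee.std_logic_1164.all;

entity tiny_fsm is
  port (
    clk, rst : in  std_logic;
    go       : in  std_logic;
    busy     : out std_logic
  );
end entity tiny_fsm;

architecture rtl of tiny_fsm is
  type state_t is (IDLE, RUN, DONE);  -- strong typing: states are an enum, not raw bits
  signal state, next_state : state_t;
begin
  -- Sequential process: the state register -- the only clocked logic here.
  reg : process (clk, rst)
  begin
    if rst = '1' then
      state <= IDLE;
    elsif rising_edge(clk) then
      state <= next_state;
    end if;
  end process;

  -- Combinational process: next-state logic as a case statement.
  -- Covering every state and every branch avoids inferred latches.
  nxt : process (state, go)
  begin
    case state is
      when IDLE =>
        if go = '1' then
          next_state <= RUN;
        else
          next_state <= IDLE;
        end if;
      when RUN  => next_state <= DONE;
      when DONE => next_state <= IDLE;
    end case;
  end process;

  -- Concurrent statement: always active, in parallel with both processes above.
  busy <= '1' when state = RUN else '0';
end architecture rtl;
```

Note how the two processes and the final assignment all run concurrently; only the text inside each process reads sequentially. Synthesis tools recognize this pattern and choose the actual state encoding themselves (one-hot, Gray, etc.), subject to pragmas or tool settings.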

Beyond HDLs, however, are FPGAs and ASICs (and I've designed with both). Putting the differences between FPGA and ASIC aside, an FPGA design has some very specific ties to the vendor because of the way the FPGA is architected. Assignment of I/O, synthesis, and above all the timing constraints that guide the map/place-and-route tools are things you won't learn from VHDL alone (e.g. clock domain frequencies, max/min delays, input/output delays, false/multicycle paths, setup and hold times, and worst-case timing paths in the design). These are essential to digital design but not part of the HDL at all (see the Synopsys SDC format for more info). Shell scripts, sed/awk, Perl, TCL, Scheme, and Python are also essential to know, because they glue the various tools together: scripting, processing text files, tailing log files, and batching jobs can be critical to being efficient. So is being thorough about understanding log file warnings and errors, and timing reports. Electronic Design Automation (EDA) tools also have their own idiosyncrasies, and you'll need to develop a stable "reference front-end and back-end design flow" if you haven't already. Do you use an Altera or Xilinx reference board, or an add-on PCI-based FPGA card? And how do you analyze what's coming and going at the interface? All of these questions need to be answered before you really get going on FPGAs. ASICs add an order of magnitude more complications for reasons I won't even discuss; it just gets harder. And those state machines you created without K-maps will carry synthesis pragmas that direct the compiler to build the appropriate encoding (e.g. one-hot for performance, Gray code for lower power, etc.).
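For a flavor of what those constraints look like, here is a small, purely illustrative fragment in the Synopsys SDC format mentioned above; every port, clock, and pin name in it is hypothetical:

```tcl
# Illustrative SDC fragment -- all names are made up.
create_clock -name sys_clk -period 10.0 [get_ports clk]             ;# define a 100 MHz clock domain
set_input_delay  -clock sys_clk -max 2.0 [get_ports data_in]        ;# external setup budget at the input
set_output_delay -clock sys_clk -max 1.5 [get_ports data_out]       ;# budget left for the downstream device
set_false_path -from [get_clocks sys_clk] -to [get_clocks cfg_clk]  ;# asynchronous crossing: don't time it
set_multicycle_path 2 -setup -from [get_pins acc_reg*/Q]            ;# this path is allowed two cycles
```

None of this lives in the VHDL or Verilog source; it's fed separately to synthesis and place-and-route, which is exactly why constraint files deserve study in their own right.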

Finally, there's the work world. As other posters have mentioned, North America is primarily focused on Verilog while the rest of the world uses VHDL. Most synthesizable IP cores for various functions come as Verilog. So, truth be told, you should know both major HDLs, but in the real world you would be better off being proficient in Verilog, for the simple reason that it (or at least its successors, such as SystemVerilog) is the present and the future. Also, in the work world it's critical to know the major EDA vendors' software and to put it on your resume (i.e. Mentor Graphics, Synplicity (for FPGA), Synopsys, in roughly that order, plus Cadence and Magma for ASIC), as well as all those scripting and other languages like Perl and TCL that I mentioned. Don't completely ignore VHDL, however.

On an ironic note, there are SystemC compilers for hardware that are becoming more and more important in large-scale development for video algorithms and the like. The results tend to be difficult to formally verify (i.e. C code in != Verilog out) and are often inefficient or even unrealizable in physical design, so you may need to go in and modify pieces heavily to close timing. At your level, and given the class of devices you're working on, it's quite premature to consider this, and I would strongly recommend focusing on the HDLs first. In fact, I don't even know whether any university program teaches that software.

My best advice here is to focus on learning digital design and VHDL first, then Verilog, then see how it applies to an FPGA kit from a major vendor (e.g. an Altera Cyclone, maybe even a MAX CPLD, or a Xilinx Spartan), then learn some of the industry-standard EDA tools that work alongside the vendor's own (e.g. the Synplicity FPGA compiler; the Mentor ModelSim waveform simulator for test benches and back-annotated, i.e. timing-aware, simulation; possibly some formal verification tools as well). This topic is way too big for even this post, and I feel like I'm swatting at a cloud of flies trying to get rid of them, but knowing FPGAs will greatly help your career, since advances in process technology are making FPGAs more SoC-like and cheaper all the time.
United States

Biden Reveals Location of Secret VP Bunker 550

Hugh Pickens writes "Fox News reports that 'Vice President Joe Biden, well-known for his verbal gaffes, may have finally outdone himself, divulging potentially classified information meant to save the life of a sitting vice president.' According to the report, while recently attending the Gridiron Club dinner in Washington, an annual event where powerful politicians and media elite get a chance to cozy up to one another, Biden told his dinnermates about the existence of a secret bunker under the old US Naval Observatory, which is now the home of the vice president. Although earlier reports had placed the Vice-Presidential hide-out in a highly secure complex of buildings inside Raven Rock Mountain near Blue Ridge Summit, Pennsylvania, Fox News reports that the Naval Observatory bunker is believed to be the secure, undisclosed location where former Vice President Dick Cheney remained under protection after the 9/11 attacks. According to the report, Biden 'said a young naval officer giving him a tour of the residence showed him the hideaway, which is behind a massive steel door secured by an elaborate lock with a narrow connecting hallway lined with shelves filled with communications equipment.' According to Eleanor Clift, Newsweek magazine's Washington contributing editor, 'the officer explained that when Cheney was in lock down, this was where his most trusted aides were stationed, an image that Biden conveyed in a way that suggested we shouldn't be surprised that the policies that emerged were off the wall.' In December 2002, neighbors complained of loud construction work being done at the Naval Observatory, which has been used as a residence by vice presidents since 1974. The upset neighbors were sent a letter by the observatory's superintendent, calling the work 'sensitive in nature' and 'classified,' and saying it was urgent that it be completed on a highly accelerated schedule."
Earth

Were Neanderthals Devoured By Humans? 502

Hugh Pickens writes "The Guardian reports that a Neanderthal jawbone covered in cut marks similar to those left behind when flesh is stripped from deer provides crucial evidence that humans attacked Neanderthals, and sometimes killed them, bringing back their bodies to caves to eat or to use their skulls or teeth as trophies. 'For years, people have tried to hide away from the evidence of cannibalism, but I think we have to accept it took place,' says Fernando Rozzi, of Paris's Centre National de la Récherche Scientifique. According to Rozzi, a discovery at Les Rois in south-west France provides compelling support for that argument. Previous excavations revealed bones that were thought to be exclusively human. But Rozzi's team re-examined them and found one they concluded was Neanderthal." (Continued, below.)

Comment Two changes that could've been made (Score 4, Insightful) 852

1. Less talk and more subtlety. That means little or no explicit dialog, no in-your-face pictures of dancing robots (but maybe Baltar and Six in front of an electronics store), and Jimi Hendrix's version of All Along The Watchtower playing on some radio in the background behind some guy on the street. As it stands, it was too overt and tried too hard to make its point to viewers already accustomed to thinking a bit harder.

2. What probably would've happened after Lee recommended that all technology go away is a split between those who still wanted it and those who didn't. The two sides would create a pact to keep separate from each other, with the small minority of technology-loving people going to live on a small continent off the west coast of Africa... a continent, of course, destined to be destroyed at some future point by natural disaster, taking essentially all their technology with it. This would resolve what would otherwise be an obvious dilemma and split in viewpoints among the remaining people, while plausibly explaining what happened to their technology.
Government

Obama To Launch Website For Tracking Tax Expenditures 358

internationalflights tips news that Barack Obama, in his first weekly address as President, has mentioned plans to set up a website for tracking "how and where we spend taxpayer dollars." Details about the website, Recovery.gov, are available within the American Recovery and Reinvestment Act of 2009 (PDF). The website "shall provide data on relevant economic, financial, grant, and contract information in user-friendly visual presentations to enhance public awareness of the use of funds made available in this Act," and will also "provide a means for the public to give feedback on the performance of contracts awarded for purposes of carrying out this Act." The site itself currently contains a placeholder until the passage of the Act.

Comment Re-type them and post them anonymously (Score 1) 931

Seriously, how is she going to track this down? If you're afraid of being found out, post it to Wikileaks, where it's beyond any court order. If she tries to pull anything on you, tell her she needs to prove it was you, and that if she can't, the university will be on the financial hook for it (i.e. back off).

As a former lab instructor, my job was to share my knowledge with students, not to prevent them from taking it with them. Hard-ass instructors like this just piss me off, because they think people won't show up to their lectures if the students have their notes. There's no better way to return the favor than to do exactly what they tell you not to.
