I need to do I/O. So I use an existing class. Which is, more often than not, bundled into a large library of classes. And the classes in this library depend on classes in that other library. So I need my new class, and the classes it uses and the classes they use, all the way down.
If I'm writing an app which persists data to, and queries data from, a SQL database, there are libraries (for Java, usually Hibernate) to handle the ORM duties.
By the time you write something for a website which will use Hibernate to handle the ORM duties, the Spring framework to handle the web duties, etc., you end up with an application that's > 10 MB in size. The code which I actively wrote is a tiny fraction of that.
Furthermore, if I choose to extend that other class, I may create a method which overrides a method in that class, and my method may not use theirs at all. As such, I still have to include their class, and its associated library, IN ITS ENTIRETY, even though parts of it not only aren't being used but aren't even REACHABLE. Calling foo.bar(), where foo is an object of my class, can't even reach the extended class's version of bar().
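The unreachable-override point can be sketched in a few lines of Java (class and method names here are made up for illustration):

```java
class Base {
    // Ships with the library whether or not anyone can ever call it.
    String greet() { return "base greeting"; }
}

class Derived extends Base {
    @Override
    String greet() { return "derived greeting"; } // never calls super.greet()
}

public class Unreachable {
    public static void main(String[] args) {
        Derived foo = new Derived();
        // Dynamic dispatch always selects Derived.greet(); through any
        // reference to a Derived instance, Base.greet() is dead weight
        // that still has to be shipped with the application.
        System.out.println(foo.greet()); // prints "derived greeting"
    }
}
```

No static analysis of Derived alone can drop Base, because the JVM still needs the superclass to load the subclass, so the whole parent library comes along for the ride.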
Object Oriented programs are built in layers, by accretion. As the number of layers increases, the amount of unused and unreachable code grows. Until, voila! we're down to 5% of the code in the app actually getting used, even if you exercise ALL of the functionality in the app.
Java is merely one of the worst offenders here. When you have a stack trace with 30 layers in it, that's a LOT of code, a significant fraction of which is NOT being used by your app.
Who wrote this crap? Oh, I did. Was I ever that lame? Seriously? WTF does this code do? A one-line comment would've helped.
Eagleson's Law definitely applies.
Especially as you age. I'm pushing 50. I've become the king of docs, both in the code and on the wiki. Because I've learned that there's no guarantee that I'll remember how to do that obscure thing six months from now when I need it again. And if someone else benefits from my docs, that's just icing on the cake.
Wireless bandwidth is limited by the allocated spectrum. With landlines, you can always drag more fiber or copper, hook it up, and expand your bandwidth. You can't do that with wireless.
No, but you can:
- install more towers
- reduce the power output/coverage on the existing towers, creating smaller cells
- re-use the bandwidth you've already been allocated, in smaller cells
This is how wireless carriers increase bandwidth. There's considerably more bandwidth available, per square mile, in a city than in a rural area. Not because they have more spectrum in the city. But because each tower services a smaller cell.
It's slightly more complicated than that: adjacent towers need to use non-overlapping spectrum, and every new tower needs permits, backhaul connectivity, and power. Yeah, it's expensive. Of course, dragging more fiber or copper is expensive, too.
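The cell-splitting arithmetic can be illustrated with a toy calculation. The per-cell throughput and cell radii below are made-up illustrative numbers, not real carrier figures:

```java
public class CellSplitting {
    // Capacity per square mile scales with the number of cells, because
    // each cell reuses the same allocated spectrum.
    static double cellsPerSqMile(double cellRadiusMiles) {
        return 1.0 / (Math.PI * cellRadiusMiles * cellRadiusMiles);
    }

    public static void main(String[] args) {
        double perCellMbps = 100.0; // illustrative throughput per cell

        // Rural: big cells (5-mile radius). Urban: small cells (0.5-mile).
        double rural = cellsPerSqMile(5.0) * perCellMbps;
        double urban = cellsPerSqMile(0.5) * perCellMbps;

        System.out.printf("rural: %.2f Mbps/sq mi, urban: %.2f Mbps/sq mi%n",
                          rural, urban);
        // Halving the cell radius quadruples cells per square mile, and
        // hence quadruples area capacity, with zero new spectrum.
    }
}
```

This ignores the frequency-reuse factor between adjacent cells, which divides both numbers by the same constant and so doesn't change the ratio.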
I've commented multiple times about hydraulic hybrids. I like them, relative to electric hybrids, because they have a very high power density. I like the acceleration that power brings. And 1,000 charge/discharge cycles is hard on batteries but pretty much a normal day for hydraulics.
It did not go well.
While I had a good understanding of the basics, and I could do bitwise logic and such (courtesy of my assembly language and machine language experience), I found myself struggling. Hard. It took me a while to get stuff working, because I had to "feel my way through" on most everything and I was severely handicapped in how complex the code could get before I was lost.
I eventually went back to college and got the Computer Science degree.
Being able to program is a useful skill. But if you don't know enough theory to handle relational databases, trees and other fairly complex data structures, you're hampered right out of the gate. Yeah, that's theory. Being able to code a balanced tree is useful; understanding when you do and DON'T need that data structure is more so.
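As an illustration of the "when do you DON'T need it" point: Java's standard TreeMap is already a balanced (red-black) tree, and the choice between it and a plain hash map turns entirely on whether you need ordering:

```java
import java.util.*;

public class TreeChoice {
    public static void main(String[] args) {
        // Need ordered traversal or range queries? A balanced tree
        // (TreeMap is a red-black tree) earns its O(log n) cost.
        NavigableMap<String, Integer> scores = new TreeMap<>();
        scores.put("carol", 70);
        scores.put("alice", 90);
        scores.put("bob", 80);
        System.out.println(scores.firstKey());       // prints "alice"
        System.out.println(scores.headMap("carol")); // prints "{alice=90, bob=80}"

        // Only need point lookups? A hash table is simpler and O(1);
        // coding (or even using) a balanced tree here is wasted effort.
        Map<String, Integer> lookup = new HashMap<>(scores);
        System.out.println(lookup.get("bob"));       // prints "80"
    }
}
```

That judgment call, not the ability to type out rotation code, is what the theory buys you.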
Additionally, I don't get where they're saying these degrees are all theory. I had to write a pile of assignments in C++ during my college studies, as well as learning enough Scheme, Java and MIPS and x86 Assembly Languages to write assignments in those languages. That's practical, hands-on development, gaining experience with the language and its associated APIs. Additionally, if you do an internship somewhere while you're in college (I didn't, but I've managed/mentored an intern or two, now that I'm an experienced dev), you have hands-on experience with more than just a programming language.
Every company does things a little differently. Different standards, different conventions, different infrastructure. Ergo, it is PHYSICALLY IMPOSSIBLE for someone to walk in, with no existing experience with that company, and go right to work, being productive for the employer on day one. Yes, Human Resources and management keep indulging in that pipe dream. If only the schools would teach THIS, not that. If only they'd spend less time on math and more time on the finer points of this framework. Failing to realize that they'd be chopping out useful theory which could (and quite frequently, will) be useful down the line.
Besides, if you were trained in EXACTLY what this company needed, you would never be able to jump ship to another employer. Too many employers keep finding excuses not to provide raises that keep up with the cost of living. The only way to keep up, these days, is to jump ship every few years. And your next employer will need stuff the last employer didn't need. So, getting trapped in a pipeline which is heavily customized for one employer is bad for your long-term prospects.
Your second question suggests a basic understanding of supply-demand. Good.
As the demand for lithium increases, the price WILL go up in the short-term, which will stimulate investments in creating supply. We're already seeing the ramp-up in supply coincident with the ramp-up in demand. A few years ago, Electrovaya was advertising Lithium Ion batteries, large-format, for $300 / kWh in volume. That's what Elon Musk claims to be paying for his batteries, today. But the demand is at least an order of magnitude higher. We've already passed the "hump." And Elon Musk's investment, with Panasonic, in the "Gigafactory" is intended to push the supply higher, pushing the price lower, on the cheaper side of that hump.
A sudden, HUGE spike in demand could create another hump, but most manufacturers are sensitive enough on price that they will probably avoid it.
When you go to fill up your car, you have a choice of where to fill the tank. All too often, all of the gas stations in an area have the same price. Or if one is slightly cheaper than another, there's some other factor that "evens" them out.
The problem with modern email systems is that the emails are stored in plaintext. Some systems may use site-wide public/private key encryption but, if a third party gets access to the site's private key, everything is, effectively, plaintext.
So how do we fix this?
Do all encryption/decryption on the client. The client holds the private keys. The server has everyone's public keys. All traffic and stored data is, by default, encrypted.
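A minimal sketch of that client-side scheme, using the JDK's built-in crypto. This is the standard hybrid pattern (a fresh AES key encrypts the body; the recipient's RSA public key wraps the AES key), and the class and method names are illustrative, not any real mail system's API:

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.security.*;

public class ClientSideMail {
    // What the server stores: everything opaque without the private key.
    record Envelope(byte[] wrappedKey, byte[] iv, byte[] ciphertext) {}

    static Envelope encrypt(String message, PublicKey recipientPub) throws Exception {
        // Fresh symmetric key per message.
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);
        SecretKey aesKey = kg.generateKey();

        // Encrypt the body with AES-GCM (authenticated encryption).
        byte[] iv = new byte[12];
        new SecureRandom().nextBytes(iv);
        Cipher aes = Cipher.getInstance("AES/GCM/NoPadding");
        aes.init(Cipher.ENCRYPT_MODE, aesKey, new GCMParameterSpec(128, iv));
        byte[] ct = aes.doFinal(message.getBytes(StandardCharsets.UTF_8));

        // Wrap the AES key with the recipient's public key.
        Cipher rsa = Cipher.getInstance("RSA/ECB/OAEPWithSHA-256AndMGF1Padding");
        rsa.init(Cipher.ENCRYPT_MODE, recipientPub);
        return new Envelope(rsa.doFinal(aesKey.getEncoded()), iv, ct);
    }

    static String decrypt(Envelope env, PrivateKey recipientPriv) throws Exception {
        // Only the client holds the private key, so only it can unwrap.
        Cipher rsa = Cipher.getInstance("RSA/ECB/OAEPWithSHA-256AndMGF1Padding");
        rsa.init(Cipher.DECRYPT_MODE, recipientPriv);
        SecretKey aesKey = new SecretKeySpec(rsa.doFinal(env.wrappedKey()), "AES");

        Cipher aes = Cipher.getInstance("AES/GCM/NoPadding");
        aes.init(Cipher.DECRYPT_MODE, aesKey, new GCMParameterSpec(128, env.iv()));
        return new String(aes.doFinal(env.ciphertext()), StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws Exception {
        KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
        kpg.initialize(2048);
        KeyPair recipient = kpg.generateKeyPair(); // key pair lives only on the client

        Envelope env = encrypt("meet at noon", recipient.getPublic());
        System.out.println(decrypt(env, recipient.getPrivate())); // prints "meet at noon"
    }
}
```

The server only ever sees the Envelope; even a stolen server disk or a subpoena yields ciphertext. The hard part a real system adds is key distribution and verifying that a public key actually belongs to the recipient.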
JPMorgan Chase makes their money skimming a percentage off financial transactions. And gambling (let's face it, that's what 'investing' in the modern stock market really is). And, all too often, rigging the gambling (high-frequency trading, anyone? Mortgage Backed Securities, which they sold KNOWING that the valuation was a flat-out lie?) What's the utility to society?
None. So we get kinda up-in-arms when we see people getting obscene amounts of scratch for such grifting.
The sooner a tech company can automate getting/paying loans (the one useful thing which JPMC does), the better. The sooner tech companies can create the kinds of financial networks which undercut Mastercard, Visa et al. with similar utility and lower rates (less "skim"), the better. At that point, the only thing keeping dinosaurs like JPMC in business will be regulations REQUIRING human interaction on certain transactions, put in place at the urging of JPMC-paid lobbyists (such regs already exist).
That said, not all tech companies provide utility. So being a tech company doesn't mean you're automatically off-the-hook.
Properly maintained, an airplane will last much longer than a car. But even an airplane hits "tired iron" status after some decades.
Your 30-year-old car is involved in a crash. You die. Do your relatives sue the car company? Probably not. It's an old car. Most cars don't last 30 years. The auto manufacturer is perceived as having been absolved of any liability long ago. At three years old, the manufacturer may be liable. Most people won't hold an auto manufacturer liable for a 10-year-old car, much less a 30-year-old one.
Your 30-year-old airplane crashes. You die. Do your relatives sue the aircraft manufacturer? Probably. And, quite frequently, collect. Consequently, Cessna, Piper etc. are largely on the hook for EVERY AIRPLANE THEY EVER BUILT, even ones from decades ago. At least one company was talking about leasing their aircraft, not selling them, and pointedly destroying them after so many years of flight time, so that they could limit their liability.
I know, I know. It's fashionable to blame everything on federal regulations. And yes, they are pretty stringent. But the financial liability is, quite literally, sky-high.
Then, of course, there's the fact that they don't build as many airplanes as they do cars; they never did. Which means they don't get to amortize their R&D across as many produced items.
The difference is that, for Japanese and Chinese, you had to do that on desktop PCs and laptops because no keyboard can represent the thousands of symbols those writing systems use.
As touch-based input systems become more common (smartphones/tablets), there's the possibility of a 'finger painting' style of input, where you draw the symbols directly. In that respect, East Asian languages may be better suited to text input on such devices than Western ones.