OMG I only hope it's true!!!
I have been sticking with my old MBP (late 2013) until Apple finally rid itself of Ive!!! He was a good designer, but in many ways his insistence that things get smaller and smaller really did not help Apple at all. When they lost all the ports, I was SO bummed! Suddenly you had to buy all these expansion dongles, but the one that killed me was ditching the MagSafe charging port, and I was thinking, yet another reason to never buy another MacBook.
Here's hoping it's true!!
Exactly
The issue is that small discrete processors catch and resolve probably 80 to 90% of everything that happens in an automobile, from anti-lock brakes (which have their own little processor, one that might well take input from the transmission) to detecting when your windshield washer fluid needs refilling.
These peripheral devices all communicate over the CAN bus, and/or the LIN bus for lower-priority traffic.
The automotive world does not want one overriding processor that controls everything. A car or an airplane is not a computer; it is a device with many sub-systems that interact at many levels. When they work together, it is sublime.
Give this a read... https://en.wikipedia.org/wiki/... it will deepen your understanding.
Anyone whose only frame of reference is a computer that waits minutes, if not hours, to do anything just doesn't understand. If they did, the next few sentences would make sense.
Let's say that we build every system in a car to talk to one (1) chip. Let's say it's Apple's M1 chip, or Intel's Core H series. Either one; it doesn't matter.
How would you connect all of the inputs? I doubt there are enough I/O ports on either chip. Not logical ports, but physical ones.
So even if you could magically get all these different systems to connect directly to either CPU, how in the hell are you going to code that shit? VHDL, Verilog, or SystemVerilog? That is a VERY different programming paradigm.
When automotive engineers talk about something happening at the same time, they actually mean AT THE SAME TIME: not in the next tick, not in the next interrupt cycle. Can a modern CPU even do that? I kind of doubt it, since there is only one clock-in pin, and EVERYTHING moves to that single clock. Can it look like everything is happening at once? Yes it can, but it only looks that way.
My last full-time job was writing a user-proof interface to Oracle in Java. It worked quite well, and when I handed it over it had no known bugs, other than the bugs in Java that I had to work around.
Is the Dock made by Apple?
Will Apple provide support when things go wrong?
If you can answer yes to either of those questions, then I am all in.
Yes indeed.
Still using my late 2013 MBP with Retina.
I was horrified when they stripped all the great ports off.
I love to code. To write small tight functions and procedures to perform work.
I have some pretty good ideas for lots of things.
I love OSS!
But how can I feed my family?
Well it could be done, but ONLY if each user cooperates.
And I am sure the established power companies will use every dirty trick in the book they can think of or dream up to fight this tooth and nail.
They will start by citing safety concerns, and if that doesn't work they will go to their well-paid elected representatives to pass some sort of regulation or law, either making the whole process outright illegal or putting up so MANY barriers as to make it financially impossible for the regular homeowner.
"Mycroft has been around for quite a few years, but it's recently gained a bit more notoriety thanks to privacy concerns surrounding data collection at Amazon and Google. Unlike those assistants, Mycroft only collects data if you opt in during setup. And for the users who do opt in, Mycroft promises never to sell your data to advertisers or third parties -- instead, it only uses it to help developers improve the product. Mycroft even uses the privacy-focused DuckDuckGo as its search engine instead of Google when you ask for information."
Oh yeah, and I won't cum in your mouth either!
I have been writing software for over 30 years, and dammit I have yet to forget to free after I malloc in released code!
There are a plethora of tools to check that for you! Use them!
It seems that people want to write the most insipid, stupid code they can, and they can't seem to be bothered to look and check that memory is freed after they are done using it!
Or they want to write endless arrays because they are too lazy to examine, or even estimate, how much memory an operation will use.
Everyone loves to think that there will be plenty of memory and you can never run out! My god, they are lazy AND stupid!
All these "Pile it Higher and Deeper" idiots who keep trying to write languages that allow this really need to get jobs flipping burgers, for all the "progress" they bring to the table.
Why oh WHY do you think the entire Linux kernel is ONLY written in C? Do you think Linus is an idiot? Do you think he is lazy? Do you think he is stupid? Guess what! He is not any of those things! He has looked at the pile of crap that has been written for years, and do you know what? He can't find anything better!
If you are too lazy, stupid, and ugly to even check that the memory you have allocated is freed and that the arrays you create are bounds-checked, if you can't be bothered to look at your code and stop being so fucking clever, so fucking slick, then just go do something, hell, anything else, but stop pretending you can write smart, tight, efficient code.
C is the smallest, most efficient, most elegant language ever built, so go have a glass of shut the fuck up, you whiny, complaining, little insignificant lazy ass fanbois!
And as opposed to your snide and rude remarks... Yes, I have a four-year degree, and it had lots of math. So please either reply objectively, or don't reply at all.
This place just isn't big enough for all of us. We've got to find a way off this planet.