And what, pray tell, has that scandal got to do with SSNs?
We already have a largely cashless economy in much of the world. How is that bad? I wouldn't know. To me, printing paper money and minting metal coins is just a pointless waste of resources that has no place in the 21st century. As for the "mark of the beast" - I don't know what the heck you're talking about; civilized places all have national people registries that are used daily by many branches of the government. Such systems have been in place for decades, at least in Poland. While I might not be happy about what taxes are spent on, I do appreciate that there needs to be a key that allows marking and tracking everyone's tax payments. What's wrong with that?
I don't know what it'd do for China, but it'd do several immediate things to what many Slashdotters care about.
1. It'd suddenly become not only viable, but necessary, to repair PCs. For money!
2. Small manufacturing shops would start popping up left, right and center. There'd be a real rebirth of manufacturing. Yeah, it'd be a bit in the style of the late 19th century, with an increase in pollution etc., but for a while there'd be a lot for everyone to do.
3. Every municipality with a city dump would see trash volumes go down by 10-20% at least.
4. Mining, baby. Yeah, mining would be back, big style.
Well, if political convenience is what you call madness... Hey, wait, it is. I'm far from being positive about what Snowden did, but there's a line that shouldn't be crossed - and that is respecting the sovereignty of a country. Russia or any other country is free to do here as they see fit; neither the US nor anyone else has any say in it.
Well, of course I goofed; it's not that easy (well, it is - read on). A snapshot keeps track of what has changed, yes, but it records the old state, not the new state. What you want to transfer over is the new state. So you can use the snapshot to locate the changed blocks (its metadata only), and read the actual state from the parent volume.
That's precisely what lvmsync does. That's the tool you want to do what I said above - only it'll actually work.
I'd think to use LVM and filesystem snapshots. The snapshot does the trick of journaling your changes and only your changes. You can ship the snapshot over to the backup volume simply by netcat-ing it over the network. The backup's logical volume needs to have the same size as the original volume. It's really a minimal-overhead process: once you create the new snapshot volume on the backup, the kernels on both machines are essentially executing a zero-copy sendfile() syscall. It doesn't get any cheaper than that.
Once the snapshot is transferred, your backup machine can rsync or simply merge the now-mounted snapshot to the parent volume.
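A rough sketch of the procedure above (volume group, size and host names are all made up, and nc option syntax varies between netcat variants). Note this streams the snapshot's entire view of the volume; sending only the changed extents is exactly what lvmsync was written for, so check its README for the real invocation:

```shell
# Source host: create a snapshot so subsequent writes are tracked (copy-on-write).
lvcreate --snapshot --size 5G --name data_snap /dev/vg0/data

# Backup host: listen and write straight into a same-sized logical volume.
nc -l -p 9000 > /dev/vg0/data_copy

# Source host: stream the snapshot's block device across the network.
nc backup-host 9000 < /dev/vg0/data_snap

# Source host, once the transfer is done: drop the snapshot.
lvremove /dev/vg0/data_snap
```

On the backup host you can then mount /dev/vg0/data_copy and rsync or merge it into the parent backup volume, as described below.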
And no, I'm not proposing to do it myself. I just think you haven't thought it all through - it only appears simple; that's why nobody has done it yet. Leveraging the LLVM front-end to extract interfaces from GTK and Qt code would be a tiny aspect of the job, done out of convenience.
I've looked at some of the papers, and I don't think such a translation would be anywhere in the same order-of-magnitude ballpark of complexity. It's a much bigger job than, say, proving some properties of SSA optimization. Of course I'm only an amateur and whatever I know about programming language semantics is self-taught, but I just don't see how it's anywhere near "easy" if you want to do it right. My gut feeling is that, scope-wise, it'd be more along the lines of the formal-semantics-of-C and CompCert work of Xavier Leroy et al. After all, you need some description of the semantics of both the source and target frameworks, and then a way of describing, or even just aiding/guiding, the transformation between the two. To be fit for human consumption, it'd all need to be done in some sort of declarative framework so that you don't deal with the drudgery. Well, maybe these days it's all a walk in the park for the theorem-prover-wielding young'uns.
Well, said translator would need to include a knowledge base of translation rules. Those rules would need to be expressed in some high-level language, with formal semantics and all that. Presumably, to save on the drudgery, many rules would be generated from descriptions of the semantics of "similar" concepts in Gtk and Qt. It would be far from simple stuff like pattern matching - and even that has been the subject of plenty of papers on code generators, I'm sure.
Done properly, it'd not only be a dissertation-sized piece of work, it would require some original results to implement in a way that's fun to deal with. I mean, come on, people do master's theses consisting of nothing but a description of a C-derived programming language and its formal semantics, with some proofs. We're talking about something perhaps an order of magnitude larger in scope.
Sure, it could be hacked together from a bunch of M4 macros - M4 is Turing-complete, after all, right? But that's not what I'm talking about.
Huge hoop, my ass.
Wait a minute - does that mean you program without using code generation from things higher-level than C/C++? I almost can't believe you can do any big project without speeding up development, and avoiding errors and the mundane, by generating code in ways that simply aren't possible using template metaprogramming alone. Almost anything you do eventually needs a lexer or parser (data formats, comm protocols), state machines, etc. Writing those even in C++ is a big hassle, since the compiler isn't really designed to do arbitrary computation at compile time - and that's ultimately what you need in order to be productive. Yeah, you can do it all by processing bytecode at runtime, but that bloats your product with fragments of tools that aren't needed past the build stage. Bloat is bad.
It's really silly, IMHO, to put the make utility, compiler, assembler and linker on a pedestal - those you're OK with using, but no other tool may ever come into the picture because it's somehow unkosher. Get over it: moc is a code generator, like many others you're supposed to be using to stay productive. LLVM has got tablegen for a reason - good luck implementing that with template metaprograms run during C++ compilation, ha ha. Are you going to bitch that LLVM hasn't ever really been C++, too?
Template metaprogramming quickly degenerates into mental masturbation where you express things that take a few lines of easily comprehensible imperative script in a convoluted mess of declarative functional code that makes people who do real functional programming in Haskell or OCaml feel quite happy at not having to deal with C++.
Do you really believe there is any benefit in re-expressing simple code generators as dozens of template-metaprogramming headers that not only slow down compilation but are a royal pain to debug when things go wrong? I consider Eigen, for example, to be a fairly top-shelf example of what it takes to get C++ to produce well-performing numerical code. It's a huge freakin' clusterfuck, and some things still can't be done due to limitations of either the C++ spec or its implementations.
I am, in a way, happy about people "disliking" moc. I expect they have a similar religious aversion to other code generating tools. That only makes me more productive and more competitive. So, yeah, thank you.
Signals and slots can be done in C++. Introspection can't. Moc is not about signals and slots; it's about introspection. Introspection data is then used for signals and slots, but that's just a happy coincidence, because introspection is used for other things too. I'd have hoped people would actually look at the code instead of repeating the tired old non-arguments. You know, it's pretty damn easy nowadays - no need to download and unpack the tarball.
Introspection is not overrated. It's what you need to do the things that are done in Qt, day in, day out. Qt nowadays supports both compile-time-checked and runtime-checked bindings, something that you don't get without introspection. Libsigc++ only solves the compile-time end of the deal.
I know the value of a hack that works, but there's plenty of such hacks that are TDWTF material at the same time. No hate implied.
There are many ways for you to go; I've tried all of them except the last one. I love PCIe - good luck doing any of that with legacy parallel buses. In most cases, to make it presentable, you need to get rid of the DVD drive and replace it with an HDD caddy. Those caddies have room left over for adding your own hardware - plus you get a chassis to bolt/hot-glue your hardware onto.
1. On most any MacBook/MacBook Pro you can replace the built-in mini-PCIe wireless adapter with a serial card. Such cards do exist. Some case mods and soldering are needed to bring the wires out, but you can have native serial ports on it, no problem. Been there, done that; the mod even looks presentable. I managed to fit it where the DVD used to be - from the outside it looks as if Apple had put it there themselves. The DVD had been replaced with an HDD-carrying caddy anyway, so it was easy to cut out some of the caddy. I used a mini-PCIe splitter so I retained the wireless functionality.
2. On a machine with neither ExpressCard nor Thunderbolt, you can use a mini-PCIe extender to get to an external card cage where you can plug in whatever you like. You'll need an intermediate bridge right at the boundary of your laptop's case, so that you plug the extender into a slot/connector on the case, not on the motherboard.
3. On a machine with an ExpressCard slot, just get an ExpressCard-to-PCIe-cage extender. You can plug in multiple cards there.
4. On a machine with Thunderbolt, get a Thunderbolt-to-PCIe-cage extender. Many cards can go there.
5. One could certainly lay out a board with a Thunderbolt-to-PCIe bridge chip and a PCIe UART. That way you'd have a small-form-factor hardcore serial port. I haven't found one yet.
By fresh battery, do you mean a brand-new battery? That's crappy battery life for sure. I have an early-2008 MBP with 6GB of RAM (maxed out), an SSD, and an HDD in place of the optical drive like you have. On a new battery I can pull 6.5 hours of work running a piece of legacy IDE software on Windows XP under Fusion, with the backlight on minimum (on a transpacific flight). Running Qt Creator with an occasional recompile I can pull close to 8 hours. The HDD is not spinning at that time, of course.
And how is this GNU userland helpful on a tablet, pray tell?
If computers take over (which seems to be their natural tendency), it will serve us right. -- Alistair Cooke