She sums it up pretty well... http://judgybitch.com/2013/07/29/policing-twitter-is-dumb/
Just a couple questions come to mind:
First: What is the purpose of keeping the information? If it's just to have a record for your own sake of what, when, and how much, do you even need to scan the statement or receipt or keep the original? Or can having all the info imported into a money manager be enough?
I've been using Quicken for over a decade (still using Quicken 2000 actually, as later versions are bloated) to keep all my financial history in detail. For answering questions like "When did I buy that Belkin KVM switch, so I can see if the warranty period has expired?", searching the register is good enough, as I add enough info to the memos. In this example (a real one from just a week ago), finding the information easily was enough, and it's to my advantage to have all the individual statements and detail items combined into larger account histories rather than parse an archive tree full of PDF/OCR files. (FWIW: even this old version of Quicken lets me attach scans of receipts to entries.)
Second question: In what cases is the original paper required as opposed to a scan? If you need to show an original statement, receipt, or other document to prove something or get something approved, do you know when an electronic copy or reproduction is as acceptable as the original? I don't think this is an area with consistent, clear-cut answers yet because of its newness.
Let's take an admittedly unlikely example. You have a house but have moved to take a job out of state, and you're trying to sell the house. Some scumbag squatter moves in and tries submitting false documents to claim ownership. All the documents relating to purchase and any mortgages have been scanned and shredded. Will the courts, police, banks, city and county offices etc. give you any trouble because they are not signed originals? What if the scumbag claims you fabricated the documents (like he did) and his are the originals? What if some entities accept a scan and others don't?
I've implemented a hybrid system where different documents get scanned/destroyed at different times. I have a single card-file cabinet (a filing cabinet with half-height drawers). Paper copies of everything from the current year and the previous year are kept in a drawer. At the end of each year, I take all the documents from year-1, shred most of them (assuming any need for them has passed), and put the ones I deem most critical in a small box to archive.
...they'd rather see Home users use a different licensing model... something with more long term revenue for the company. One way to help such a new model would be to make the current purchase model less attractive.
nahh. That couldn't be.
As someone with some game development experience, let me throw in some observations (based on the specs mentioned here).
The 3.2 GHz PowerPC CPUs in the Xbox 360 and PS3 were in-order execution units. As I remember, code on the 360 typically executed at about 0.2 IPC (instructions per cycle), sometimes worse. The very best hand-optimized assembler doing tasks like video decoding could execute at about 0.9 IPC once properly cached and unrolled.
AMD and Intel now have decades of R&D into out-of-order x86 execution (the x86/x64 opcodes being translated to internal micro-ops), which is a major factor in their performance. Even the PowerPC G5 chip devoted a good chunk of its silicon to out-of-order execution. The 360 and PS3 CPUs, designed almost 10 years ago, traded out-of-order execution for die size and clock speed.
The specs say that the 1.6 GHz CPUs can issue up to 2 instructions per cycle. If real-world performance works out to an IPC of 1.2 to 1.6, which seems very doable, then you will see a 3x to 4x increase in the real-world rate of instructions being executed (0.2 IPC @ 3.2 GHz == 0.4 IPC @ 1.6 GHz in raw throughput). This doesn't take into account any efficiency gains due to the instruction set, cache, etc.
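The arithmetic behind that 3x-4x claim is easy to check with a back-of-envelope calculation (the IPC figures are the rough numbers from my recollection above, not official benchmarks):

```python
# Instruction throughput comparison using the rough IPC numbers cited above.
old_clock_hz = 3.2e9   # Xbox 360 / PS3 PowerPC cores
old_ipc      = 0.2     # typical in-order real-world IPC (my estimate)

new_clock_hz = 1.6e9   # newer console CPU cores
new_ipc_low, new_ipc_high = 1.2, 1.6  # plausible out-of-order IPC range (assumed)

old_throughput = old_clock_hz * old_ipc  # instructions per second
speedup_low  = (new_clock_hz * new_ipc_low)  / old_throughput
speedup_high = (new_clock_hz * new_ipc_high) / old_throughput

print(f"old: {old_throughput / 1e9:.2f} G instructions/sec")
print(f"speedup: {speedup_low:.1f}x to {speedup_high:.1f}x")
```

So halving the clock while sextupling the effective IPC still nets a 3.0x to 4.0x gain in raw instruction throughput.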
And at the same time, I would imagine it's a whole lot easier to deal with other things on the chipsets at 1.6 GHz than at 3.2 GHz (mature tech and all that).
It did get released in Europe by Philips as the Videopac G7400/G7401 (where the market for Videopac games was stronger).
A handful of Odyssey 3 prototypes still exist and are in the hands of collectors.
When I first watched Space:1999 season 1 in the mid-70s, one of the things they did made a big impression on me: Some of the episodes would end with something like this:
John: What the hell was that and how did we survive?
Victor: I don't know. We don't know. There's a lot of stuff in the universe that we have no idea about, and it could just as easily have killed us all. We survived due to sheer luck and not because we're anything special.
That's paraphrased of course, but compared to the tone and formula/attitude of all the other action and sci-fi on TV in that era, it was downright subversive.
I must be getting very old. Back in the 8-bit heyday (1979-1983), Softside Magazine (for TRS-80, Apple ][ and Atari 800 users) used to have 2 submission contests that they ran in almost every issue: One line programs ( "one-liners" I think they called them) and 1K programs (program size without running = 1023 bytes or less).
The TRS-80 was probably the best machine for one-liners, as a single line could be 245 or so characters long (the Atari was limited to 120 characters, though you could abbreviate some keywords; I don't recall the Apple ][ BASIC line limit).
The one-liner I remember the most was a graphical version of the old "Lunar Lander" game for the TRS-80. Yes, graphical. A loop (X=0 to X=127) created the lunar landscape, followed by a loop which updated the state machine of your ship (a single "dot" drawn with the SET and RESET commands), factored in which keys you were holding down (a PEEK of the keyboard matrix, I think), and tested to see if you hit the ground with your velocity under some threshold. *THAT* single-line effort was certainly more interesting than the one presented here.
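For anyone who never saw one of these, here's a rough Python sketch of the core idea: a tiny lander state machine with gravity, thrust, and a touchdown-speed check. (The original was TRS-80 BASIC with SET/RESET pixel graphics and keyboard PEEKs; all the constants here are made up for illustration, and the "autopilot" stands in for the player holding the thrust key.)

```python
# Toy lander physics loop, one update per "frame". Constants are invented.
GRAVITY = 0.05      # downward acceleration per frame
THRUST = -0.15      # extra acceleration while the thrust key is held
SAFE_SPEED = 1.0    # maximum survivable touchdown velocity

def run_lander(ground_y, autopilot=False):
    """Simulate frames until the ship reaches ground_y. With autopilot on,
    thrust fires whenever descent speed creeps near the safe limit (a crude
    stand-in for a player's key presses)."""
    y, vy = 0.0, 0.0
    while y < ground_y:
        burn = autopilot and vy > 0.8 * SAFE_SPEED
        vy += GRAVITY + (THRUST if burn else 0.0)
        y += vy
    # The one-liner's final test: did you hit the ground too fast?
    return "landed" if vy <= SAFE_SPEED else "crashed"

print(run_lander(40.0))                  # free fall -> "crashed"
print(run_lander(40.0, autopilot=True))  # throttled descent -> "landed"
```

The whole thing - terrain generation, input, physics, and the win/lose test - fit in one BASIC line on the TRS-80, which is what made it so impressive.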
I was working 'down the street' if you will at the time, as a programmer on one of their direct competitors (Age of Empires), and it's easy to forget the circumstances we all were working under at the time.
People weren't using templates in games for a couple of reasons. Familiarity was one, but templates were also still very new, and the asm output by compilers for them was often very inefficient, when it was not outright BUGGY if you pushed it.
Also, you tend to forget how slow and limited PCs were then -- Your phone today probably runs circles around not just the machines that games were run on, but the PCs used to develop them.
The system specifications "on the box" for StarCraft were: a Pentium 90 (or equivalent, which could be a 486-133), 16MB of RAM, Windows 95 or NT 4.0, and an SVGA video card with 512KB or 1MB of VRAM. Think about that for a minute. These were 2D video cards, not 3D. Age had almost identical specs. A full rebuild of AoE on our dev machines (Pentium Pros) took 15-20 minutes.
It was very normal to worry about saving 2 bytes, or just a few cycles of CPU time, back then. So you did everything bespoke by hand, and didn't genericize much.
And to be honest, we didn't know then what we know now. The programming practices most of my peers just follow automatically today -- we hadn't developed or learned them yet. We did what we could with the knowledge and tools we had, and shipped complete AAA games for costs in man-hours and dollars that seem ludicrously small today.
Don't get me wrong: besides being a lot of work, it was a lot of fun too. One thing I remember was our companies putting each other on the beta test lists for our upcoming games.
I should clarify that I am talking about guys with wives who are stay at home moms and have kids in the single digit age range.
If the wife has never held a similar job (in terms of work demands, etc.), and especially if she has never worked at all, she will usually have a huge misconception of what it's like for her husband, which fuels the lack of respect. When you are waiting for something to compile, or thinking through a tricky problem, she will see you as just sitting there motionless, so to her, of course, you are "doing nothing" and are free, because her frame of reference usually involves physical labor or interaction.
As for small kids, you can be screwed even with a locked door on your office. When I did the work-from-home remote contracting gig, my 4-year-old son would pound on the door screaming his lungs out for me for 30 minutes straight (my now ex-wife was no help because of the above; she would actively send the kids to bother me so she could get a break).
Mod this up.
I was about to say: if you have a wife and young kids - DON'T WORK FROM HOME. Your work won't be taken seriously, you will be CONSTANTLY interrupted, and your marriage will likely suffer.
Seriously, this has happened to me and EVERY other guy I know who has young kids and tried working from home (admittedly, 5 guys total). Their wives didn't respect the need for isolation, and saw them as available to watch the kids and do anything else they wanted. They would interrupt any time they felt like it, and ignore repeated requests not to. To them, it was like: "Hey, I work from home too! He's just sitting here on the computer not doing much. If that was me, I would be able to stop and do something else. He can take the kids while I take a nap and then go to a movie with my girlfriends!"
Mod parents up. I've been making games professionally for over 15 years, and I was going to mention both of those sites.
My general advice, when asked, "How do I learn to make games" is to 1) Look around online as there are communities and resources all over, and 2) Just start making something, anything. Do it rather than talk about it. Start with something small - an early Atari 2600-class game project - and expand as you become comfortable and more knowledgeable and experienced.
... because just before drive production went offline, I finally outfitted my new home server with 9TB of storage for just $420. Pretty much my entire life, it's been the case that once I buy some computer hardware, two weeks later (or however long the return period is), the price is guaranteed to be cut significantly (or a much better version is released).
Someone needs to check the alignment of the universe.
Something I noticed over the last three weeks was the absence of hard drive specials or sales in the weekly adverts from retailers like Fry's Electronics.
When Newegg sent out their "November Madness" and "Black Friday" emails to subscribers, there was *not a single* hard drive to be found in the sales.
Normally I'd try for a witty or insightful comment here, but I just don't have much more to say, as I don't know if it's more due to profit-taking on the retailers' part, or if they are more concerned about running short of (or out of) supply in the near future.
Very coincidentally, I was having a conversation about Atari 2600 emulation last night, and it was suggested that "perfect" emulation of the early consoles might be impossible due to the change from CRTs to LCD TVs (and monitors).
The culprit in this case is the latency added by digital displays (and PC style video hardware) and packet-based input devices (USB, etc).
On pre-NES hardware like the Atari 2600, games would (at times) be synchronized to the video output signal of the CRT, and the consoles also had specialized video hardware which often did collision detection between various video elements (sprites, missiles, backgrounds, etc.), meaning that results were detected as the frame was output and were available to the game code instantly.
This *can* be emulated perfectly by the emulator on a PC CPU.
But these games ran their main loops at 60 hz (or 50 for PAL), and many of them required near perfect reflexes and timing.
Once the emulator has completed and rendered a frame of the old console game, how long until the player actually sees the result?
The answer is: It varies.
Will there be a 1/60th second delay before the video card swaps the rendered frame to the front buffer?
And how long will it take for the front buffer to be sent over HDMI/DVI to the LCD TV set or PC monitor? Another 1/60th of a second?
And how long before the TV or monitor actually displays the frame? Another 1 or 2 60ths of a second? more? (The TV/Monitor takes time to buffer, filter/process and scale the image). You usually don't notice this on TV sets because the audio is buffered and delayed so it stays in sync with the video.
And now that the next frame is finally visible to the player, they see that they need to react to save their on-screen character, so they press the controller appropriately.
And how long does it take the USB adapter/controller to send a packet to the PC, and for the PC to process it and make it available to the emulator? Compare that to the original hardware, where pushing a button or joystick opened/closed an individual circuit whose status was polled immediately and directly by the console CPU. Maybe it's fast enough; maybe it adds another frame of latency.
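Adding up the stages in that chain gives a rough latency budget. The per-stage numbers here are illustrative guesses matching the "1/60th here, another 1/60th there" estimates above, not measurements of any particular setup:

```python
# Rough latency budget for the emulator feedback loop, in 60 Hz frames.
# Every number is an assumed worst-plausible value, not a measurement.
FRAME_MS = 1000.0 / 60.0  # one 60 Hz frame, about 16.7 ms

pipeline_frames = {
    "emulator waits for vsync/buffer swap": 1,
    "scanout over HDMI/DVI":                1,
    "TV/monitor buffering + scaling":       2,
    "USB controller poll + OS + emulator":  1,
}

total_frames = sum(pipeline_frames.values())
total_ms = total_frames * FRAME_MS
print(f"added round-trip latency: ~{total_ms:.0f} ms ({total_frames} frames)")
```

Even with these conservative guesses, that's on the order of 80 ms of added lag versus the original console, where scanout was direct and the joystick switches were polled by the CPU with effectively zero delay.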
In the end, with emulators we likely have a longer feedback loop from the emulator to the display to the player to the controller and back to the emulator compared to the original console and CRT displays, and many old games just won't play the same as a result.
We can emulate the game perfectly from the standpoint of the hardware simulation and audio/visual display, but still not get the play experience emulated perfectly because of changes in the feedback loop to the player.
I was under the impression that there were no public APIs for getting at the audio data from the call in progress, specifically to keep people from making apps that could record calls, due to legality issues (wiretapping, etc., depending on your location and jurisdiction).
The "recorder" programs that are out there record directly from the mic, and are usually not able to pick up the output from the speaker (and if they do, it's usually very faint). iPhones/iOS lack the capability for the same reasons.
I think a lot of people would find it very useful, for a number of various reasons, to have the ability to have their calls automatically recorded, with metadata of who, when, etc, stored in