The main issue with The Mandalorian is that it relies too much on nostalgia.
And it doesn't add anything to the mythos; it just takes.
That, and the godawful amount of filler. The episodes could easily be cut to a third of their length without losing anything.
I thought Baby Yoda was a cute idea, for maybe an episode and a half. Then it became all Baby Yoda, all the time.
I made a parody series to try to spread the word, but it seems most people are easily snowed by nostalgia and obnoxious cuteness — "Mandalorian Abridged" still doesn't seem to have found its audience.
You probably don't know this. Epic have had more than just one hit game.
Their first released game as Epic appeared in 1992 (and one before that, in 1991, under a different company name). I really dare you to put out a full-blown release of a game software product running on just one platform, even with today's free tools and git-ware. It takes a lot more time than you can imagine.
In those 30 years, the tiny company Epic managed to outsmart id, Sony, Nintendo, Electronic Arts, Valve, Microsoft, Ubisoft, Crytek, Take2, Vivendi, Infinity Ward, BioWare, Capcom, TT, Square Enix, Sierra, Dynamix, MicroProse, ActiBliz, Rockstar, Naughty Dog, Mojang, DICE, Treyarch, Bethesda and many more. That's a hell of a lot of industry competition to handle. They did not simply manage; they were setting the rules at every step of the way.
There is a ray-tracing extension to Vulkan (VK_KHR_ray_query / VK_KHR_ray_tracing_pipeline) which allows an optimized acceleration structure, in practice usually a BVH rather than a kd-tree, to be built into a GPU buffer and traversed using dedicated shader instructions.
It's quite simple. You take your scene and slice and dice it into triangles: even parametric surfaces like NURBS, the subdivision surfaces used by Pixar, and 3D models from 3ds Max, Maya or Blender. All of that gets converted into textures, material shaders and a geometry mesh. The geometry mesh gets chopped up into a hierarchical acceleration structure like a BVH or kd-tree. All of this can be stored in a data format loaded straight into the GPU or CPU cache. It's all vectors, matrices and parametric coordinates. Separate processors are assigned to each pixel and do things like supersampling, ambient occlusion, global illumination, caustics and radiosity. NVIDIA have their OptiX ray-tracing library; Intel have Embree.
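At the bottom of all of that sits one tiny operation repeated millions of times: "does this ray hit this triangle?" Here's a minimal self-contained sketch of that test (the classic Möller-Trumbore algorithm) in plain C++. It's illustrative only, not the OptiX/Embree/Vulkan API, and a real tracer would walk a BVH instead of brute-forcing every triangle:

    #include <cstdio>
    #include <optional>

    struct Vec3 { float x, y, z; };

    Vec3  sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    Vec3  cross(Vec3 a, Vec3 b) { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }
    float dot(Vec3 a, Vec3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }

    // Möller-Trumbore: returns distance t along the ray if it hits the triangle.
    std::optional<float> intersect(Vec3 orig, Vec3 dir, Vec3 v0, Vec3 v1, Vec3 v2) {
        const float eps = 1e-7f;
        Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
        Vec3 p  = cross(dir, e2);
        float det = dot(e1, p);
        if (det > -eps && det < eps) return std::nullopt;  // ray parallel to triangle
        float inv = 1.0f / det;
        Vec3 t = sub(orig, v0);
        float u = dot(t, p) * inv;
        if (u < 0.0f || u > 1.0f) return std::nullopt;     // outside barycentric range
        Vec3 q = cross(t, e1);
        float v = dot(dir, q) * inv;
        if (v < 0.0f || u + v > 1.0f) return std::nullopt;
        float dist = dot(e2, q) * inv;
        if (dist <= eps) return std::nullopt;              // hit is behind the ray origin
        return dist;
    }

    int main() {
        // One triangle in the z=1 plane; a ray shot straight down +z from the origin.
        Vec3 v0{-1, -1, 1}, v1{1, -1, 1}, v2{0, 1, 1};
        if (auto t = intersect({0, 0, 0}, {0, 0, 1}, v0, v1, v2))
            std::printf("hit at t = %f\n", *t);  // prints t = 1.0
    }

Everything else in the pipeline (the acceleration structure, the shaders, the sampling) exists to decide which rays to cast and what to do with the hit.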
Our jobs have already been robotized many times. Email replaced printing out documents and dropping them into someone else's in-tray. Compilers replaced hand-coding assembly. Washing machines replaced hand-washing.
Those jets of charged particles would generate magnetic fields and pull the quasars or black holes together, or maybe line them up in neat little rows.
If you were to take 100 ball bearings and scatter them one by one across the classic rubber-sheet example, each would just make its own small gravity well. If you then let them all roll to the center, the sheet would be deformed far more.
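To put a rough number on it (a back-of-the-envelope sketch, assuming plain Newtonian potentials, which simply add): outside the masses, the depth of the well scales with the enclosed mass, so for 100 bearings of mass m piled into one spot,

    \Phi_{\text{single}}(r) = -\frac{Gm}{r}, \qquad
    \Phi_{\text{clumped}}(r) = -\frac{G(100m)}{r} = 100\,\Phi_{\text{single}}(r)

i.e. the clumped well is 100 times deeper at any given distance than the well of a single scattered bearing.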
If that were a children's petting zoo, and they were feeding the critters live baby chicks, then the devil enclosure would be quite appropriate.
Unfortunately, many of those patents for performance-enhancing features built on out-of-order execution were based on a single research paper. That work was implemented in one CPU vendor's design, then cross-licensed to other CPU vendors. RISC-V has the advantage that it doesn't have those vulnerabilities baked in and built upon.
That would be insane - an SSD with a built-in GPU / compute engine. That would get close to the "take the CPU to the data" approach for big-data processing.
Given that massively heavy objects in space stretch space-time, it seems logical that a quasar could create its own massive gravity well. From our perspective, looking straight into that gravity well, the quasar would appear to be billions of light-years farther away than it really is. If for any reason it suddenly disintegrated into lots of smaller objects, the way a magician's disappearing trick leaves a cloud of sparks, then that gravity well would suddenly vanish and be replaced with the stars of a galaxy. That galaxy of stars would then appear to be much closer than the quasar did.
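For reference, the standard gravitational-redshift formula captures the mechanism being invoked here: light climbing out of a well of Schwarzschild radius r_s, emitted at radius r, is redshifted by

    z_{\text{grav}} = \frac{1}{\sqrt{1 - r_s/r}} - 1, \qquad r_s = \frac{2GM}{c^2}

A deeper well gives a larger z, and an observer attributing all redshift to cosmic expansion would read that larger z as a greater distance.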
Almost anything derogatory you could say about today's software design would be accurate. -- K.E. Iverson