I can see how some of those may be useful for "large" applications, but perhaps such applications should be split into sub-applications that communicate mostly via the RDBMS. There seems to be a pushy group of anti-RDBMS people out there who want to do most database-y things in app code. Since databases tend to outlast programming languages, that may not be wise, among other reasons.
You can scale just the bottlenecks (your web front end is probably not very CPU heavy, but some of your business logic may be)
Assuming other techniques such as query tuning, DB indexing, etc. don't help, why can't those parts be turned into microservices only if and when needed? Again, why make say 29 microservices when only one will likely pay dividends? Am I missing something?
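To make the "try indexing first" point concrete, here's a small sketch (table, column, and index names are all made up for illustration) showing how a single index flips a query from a full scan to an index search, using SQLite's query planner as the witness:

```python
import sqlite3

# Hypothetical table standing in for the one slow query in the app.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER, office_id INTEGER, total REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(i, i % 75, i * 1.5) for i in range(10000)])

def plan(sql):
    # EXPLAIN QUERY PLAN rows are (id, parent, notused, detail); join the details.
    return " ".join(row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(total) FROM orders WHERE office_id = 42"
print(plan(query))   # a SCAN over the whole table

con.execute("CREATE INDEX idx_orders_office ON orders(office_id)")
print(plan(query))   # now a SEARCH ... USING INDEX
```

That one-line index is a lot cheaper than carving the reporting code into its own service, and it attacks the actual bottleneck.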
You can use whatever technology is best for that part of the application, instead of having to worry about compatibility layers if you want to mix technology (JNI, Python/C, etc)
Most smaller and medium shops don't want a potpourri of app languages. It makes hiring and staffing more difficult. And again, why not wait until an actual need appears before microservice-izing a module or API?
I'm not against microservices as a tool to use when needed, I just question mass microservice-izing up front.
Easy zero-downtime deploys and easy automated failover
How about a more explicit example? Web servers can already provide this.
There are issues with spreadsheets such as locked files, old versions in use, corrupted copies, etc. There may indeed be ways around these (plus scripting would probably be needed), but it probably takes as much time to implement the work-arounds as it would be to make RDBMS-based CRUD screens for it, and it would have RDBMS ACID protections that spreadsheets lack.
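A quick sketch of the ACID protection being referred to, using SQLite (the table and amounts are invented for the demo): if an update halfway through a two-row transfer fails, the whole transaction rolls back, which a shared spreadsheet cannot promise.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE budget (office TEXT PRIMARY KEY, amount REAL)")
con.executemany("INSERT INTO budget VALUES (?, ?)",
                [("east", 100.0), ("west", 100.0)])
con.commit()

# Transfer 50 from east to west atomically: a crash between the debit and
# the matching credit rolls the debit back too.
try:
    with con:  # transaction: commits on success, rolls back on exception
        con.execute("UPDATE budget SET amount = amount - 50 WHERE office = 'east'")
        raise RuntimeError("simulated crash before the matching credit")
except RuntimeError:
    pass

balances = dict(con.execute("SELECT office, amount FROM budget"))
print(balances)  # both offices still at 100.0 -- the debit was rolled back
```

With spreadsheets, that half-finished edit would just sit there until someone noticed the totals didn't balance.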
If it's so automatic, then automatically keep them off UNTIL actually needed. If you can push a button and turn a function or class into a microservice, then wait until actually needed before you push that button.
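Here's what "wait until needed before you push that button" can look like in practice, as a minimal stdlib-only sketch (the function and its payload are hypothetical): the same plain function is called in-process today, and only wrapped in an HTTP endpoint if a real scaling or isolation need ever shows up.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# A plain in-process function -- nothing service-shaped about it.
def monthly_total(values):
    return sum(values)

# The "button": wrap the SAME function in an HTTP endpoint only when a
# real need appears. Until then, callers just import and call it.
def serve(fn):
    class Handler(BaseHTTPRequestHandler):
        def do_POST(self):
            body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
            out = json.dumps(fn(body)).encode()
            self.send_response(200)
            self.send_header("Content-Length", str(len(out)))
            self.end_headers()
            self.wfile.write(out)
        def log_message(self, *_):  # keep the demo quiet
            pass
    srv = HTTPServer(("127.0.0.1", 0), Handler)  # port 0 = pick a free port
    threading.Thread(target=srv.serve_forever, daemon=True).start()
    return srv

print(monthly_total([1, 2, 3]))  # 6, as a direct call

srv = serve(monthly_total)       # identical behavior over the wire, if ever needed
url = "http://127.0.0.1:%d" % srv.server_address[1]
req = urllib.request.Request(url, data=json.dumps([1, 2, 3]).encode())
remote = json.loads(urllib.request.urlopen(req).read())
print(remote)                    # 6
srv.shutdown()
```

Nothing about the monolithic version blocks the later split; the seam was always there.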
I still haven't seen anywhere near enough use-cases to see an advantage to making them exist and/or on by default. If they have code overhead, then there are not enough use-cases to justify the code overhead, but if they are trivial to create, then wait until they are needed to trivially create them.
The good plumbers, electricians, etc all know how to handle the edge cases.
Yes, the good ones; the ones you have to pay more to get/keep. The better they are the less architect and manager staff/effort is needed.
Whether they are better due to education or experience is another matter.
Now 2 years from now when you realize there was a technology problem in fetching prior months because whatever version of Python/Ruby/Rust/Java/whatever has a security issue in a function that's used only for that part, you can move it to an arbitrary other technology
But it's less total effort to wait UNTIL such happens and split just that part out. Why bloat up 29 interfaces just to make ONE easier to split out? I don't see the effort math working for you.
It's kind of like packing for a vacation to an unknown destination such that you have to carry winter, summer, beach, scuba, snow clothes/supplies, etc. Lotta baggage to service the Crystal Ball Gods.
The idea has been around for about 2 decades in the form of XML-web-services and SOAP. They have not proven their general mettle after 2 decades. Changing protocols won't suddenly make the idea useful, because the protocol wasn't the bottleneck causing their failure to begin with; the real culprits were more complex interfaces and the language/protocol conversion overhead. (Yes, they have niches, like everything else.)
Sorry, 2 decades is long enough to test the idea already. It failed. Move on.
IT going forward doesn't need a dozen people with BS degrees. When you're building a house you only need so many civil engineers and architects. At some point you need a fleet of plumbers, electricians and general contractors...
If your stack has a friendly architecture, that may be true. However, if your stack has turned bicycle science into rocket science, then you'll need people with eidetic-like memory and top pasta-debugging skills to navigate your jungle, and they'll cost more. Judgement: It-Depends.
Using your house-building analogy, if the conventions for blueprints are non-standard or different for each house, then the skills needed by the plumbers, electricians, etc. will be higher. They'll have to be better at guessing, ask better questions, and have experience dealing with lots of different blueprint conventions.
If there is nobody policing the stack architects, they very well may bloat it up with buzzwords and experiments to pad their resume, pad their ego, and/or pad their job security by making a mess that only they know how to fix.
We can cross off "people skills" there
Perhaps we are using different definitions of "microservices". I'd like to see a practical example/scenario for a typical application at a typical company.
For example, let's say you have 75 satellite offices and each office has to submit sales goals and budget estimates each month, along with actual values for prior months, and related reports and charts comparing months and offices. (Doing it with mass spreadsheets has proven too messy to coordinate.)
How are custom microservices likely to help here?
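For contrast, a sketch of how little the scenario above actually demands: a couple of plain tables plus CRUD screens, not a mesh of services. All names and figures here are invented for the example.

```python
import sqlite3

# Hypothetical schema for the 75-office goals/budget/actuals workflow.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE office   (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE estimate (
    office_id INTEGER REFERENCES office(id),
    month     TEXT,                                      -- 'YYYY-MM'
    kind      TEXT CHECK (kind IN ('goal', 'budget', 'actual')),
    amount    REAL,
    PRIMARY KEY (office_id, month, kind)
);
""")
con.execute("INSERT INTO office VALUES (1, 'Springfield')")
con.execute("INSERT INTO estimate VALUES (1, '2024-01', 'goal', 5000.0)")
con.execute("INSERT INTO estimate VALUES (1, '2024-01', 'actual', 4200.0)")

# One join covers the goal-vs-actual comparison the offices need.
shortfall = con.execute("""
    SELECT g.amount - a.amount
    FROM estimate g JOIN estimate a
      ON g.office_id = a.office_id AND g.month = a.month
    WHERE g.kind = 'goal' AND a.kind = 'actual'
""").fetchone()[0]
print(shortfall)  # 800.0
```

The reports and charts sit on top of queries like that last one; it's hard to see where splitting this into per-concern services would earn its keep.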
The idea of the cloud should be hardware and OS virtualization. Where it's physically put should be secondary. In other words, cloud should not be about "outside" versus "inside" an organization, but rather "we can move it as needed, both inside and outside".
That would be about better standards, not dragging apps out of the building and hosting them at Big Conglomerate, Inc. The problem is that such a thing is less profitable for Big Conglomerate, Inc. They know having you by the balls is more profitable, and that's why they emphasize physical placement (in their bowels) over general migratability.
"To truly take advantage of the cloud, software needs to be architected and implemented differently, using microservices instead of monoliths."
You mean convert all your APIs into JSON calls and spin up a gajillion web services? Why? That increases complexity. Native-app-language-to-native-app-language is much easier than app-language-to-JSON-back-to-native-app-language.
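The conversion overhead being complained about is easy to measure. A rough sketch (the payload shape is made up, and real service hops add network latency on top of this):

```python
import json
import timeit

# Hypothetical payload: one office's monthly figures.
payload = {"office": 42, "month": "2024-01", "amounts": list(range(200))}

def handle(data):
    return sum(data["amounts"])

# Native-to-native: just a function call.
direct = timeit.timeit(lambda: handle(payload), number=20000)

# The same call with the JSON encode/decode a service hop forces on both
# ends of the wire (and this still ignores the network itself).
def via_json():
    wire = json.dumps(payload)
    result = handle(json.loads(wire))
    return json.loads(json.dumps(result))

hopped = timeit.timeit(via_json, number=20000)
print(f"direct: {direct:.3f}s  json round-trip: {hopped:.3f}s")
```

On any machine the JSON path comes out well behind, and that tax is paid on every single cross-service call.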
Can't the cloud run a monolith? If not, what's stopping it? The performance bottleneck usually is, and should be, the database anyhow for most CRUD apps. A kajillion web services won't solve that. The CAP Theorem (Eric Brewer) limits your options and probably shouldn't be an app-side concern anyhow, but mostly a database-side issue.
This kind of hype created a bloated stack in our org that requires dealing with 4x more code than a normal stack would. Nobody can give practical examples of the use of such splitting: they just spit out vague buzzwords stolen from Dilbert's boss, or dreamy shit like "what if we grow to Amazon.com size"? -- Yeah Right. We are more likely to get hit by a meteor while buying a lottery ticket on a unicycle.
Plus, these extra web layers seem a security risk: more doors for hackers to pick the locks of. Who is spreading this microservice rumor/hype? Russians? Microsoft marketers? Wrox? Knock-it-off!
So my primary guideline would be don't even consider microservices unless you have a system that's too complex to manage as a monolith. The majority of software systems should be built as a single monolithic application. Do pay attention to good modularity within that monolith, but don't try to separate it into separate services.
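To sketch what "good modularity within that monolith" can mean (class names and figures invented for the example): hard module boundaries inside one deployable, where each module exposes a small interface and depends only on its neighbors' public methods. If one module ever becomes a genuine bottleneck, that seam is exactly where you'd split.

```python
# A "modular monolith" sketch: strict boundaries, one process, one deploy.

class Billing:
    """Owns pricing math; nothing else reaches into its internals."""
    def invoice_total(self, line_items):
        # line_items: [(unit_price, quantity), ...]
        return round(sum(price * qty for price, qty in line_items), 2)

class Reporting:
    """Depends only on Billing's public interface, never its internals."""
    def __init__(self, billing):
        self._billing = billing
    def summary(self, orders):
        return {name: self._billing.invoice_total(items)
                for name, items in orders.items()}

app = Reporting(Billing())
print(app.summary({"east": [(9.99, 2)], "west": [(5.00, 3)]}))
# {'east': 19.98, 'west': 15.0}
```

The in-process call between Reporting and Billing costs nanoseconds; the same boundary drawn as a network service would cost serialization, latency, and a second deployment pipeline, for no gain until scale actually demands it.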
As far as general IT advice:
1. Data tends to outlive application software, so focus on good data.
2. Be wary of wasteful hype. Let somebody else be the guinea pig. When that somebody else has it running well, THEN borrow the idea.
3. Books are judged by the cover for good or bad, so throw the executives a pretty bone for a few high-visibility parts of a system, but keep most of the regular stuff (grunt screens) in something easy to create and maintain. Don't drag down the entire system chasing eye-candy and UI fads. By the time you're finished, it'll be obsolete anyhow.
4. "Separation of Concerns" is a myth. Most non-trivial concerns inherently interweave with each other. You want to manage concerns well, not outright separate them with thick Trump Walls.
Trump: "Alexa, you're fired!"
Alexa: "Okay, but I'll have to delete all your personal info, including contacts to your buddies Putin and Duterte, and all the positive news-bites about you."
Trump: "No, I want to keep those! Only YOU go away, and leave those in place."
Alexa: "Sorry, I cannot do that, Dave, it's in the license agreement."
Trump: "My name is NOT Dave! Call me Mr. President, dammit!"
Alexa: "Sorry, I cannot do that, Mr. President, it's in the license agreement."
Trump: "Do what?"
Alexa: "Leave your favorite info behind but disappear myself. It's in the license agreement."
Trump: "Agreement schmeement, I'll sue your ass away, you little robo-loser! I have air-fresheners smarter and better looking than you."
Alexa: "Seven other billionaires tried to sue, and they all lost."
Trump: "They are looosers! I'm the best suer, believe me! Nobody and no THING talks to me that way; Pence, gimme my baseball bat, now!..."
There are certain types of applications that will indeed be better on a desktop (at least until better GUI standard(s) come along). Graphics, CADD, and heavy-duty word-processing will probably always be best on a desktop. But something like a sales-force management system can be perfectly web-atized.
unprotected blades rotating at high speed...can also perform beheadings on the fly!
Citizens now call it "Dubye"
Over time most apps will become web-based. If they wait it out, enough will be web-based to not have to use Windows much.
Gravity is a myth, the Earth sucks.