While the article is largely factually correct, it's definitely biased towards Vue. React (core) is largely about the view/rendering engine and less about state management or application structure. It works very well with the broader community and resources in the npm ecosystem.
Regarding state, there are other options for React that are closer to Vuex, such as MobX, and GraphQL clients have been gaining popularity as well. Not to mention React's Context (Provider/Consumer) API and the newer Hooks for state.
In my opinion and experience, React is much better for building larger applications with teams of developers. This has been the case compared to every other framework I've worked with, and that's been a lot, including Angular, AngularJS, Vue, Backbone, Ember, and others going back over two decades. React + Material UI and JSS has been a far nicer experience for developing web-based applications than any other set of tooling I've ever used, by a large margin.
That doesn't mean don't look at alternatives... Blazor is interesting, even if the default directory structure is ugly (you can adjust it), as are other WebAssembly-targeted options... Yew (Rust) is really cool, though my own knowledge of Rust is really shallow. Always consider alternatives... There was a lightweight framework posted a few days ago that was pretty cool. It really depends on what your needs are.
If I were trying to enhance a mostly static application already in place, I might reach for Vue or something even lighter. If I have to build an application with a dozen other developers, it's React, hands down. As for Angular, I'd rather use TypeScript with React than deal with Angular's overly prescriptive complexity. I like a code-base that is largely discoverable and as simple to use/enhance as possible; Angular doesn't allow for that.
> during the browser wartime IE vs Netscape’s Navigator
It's worth noting that jQuery first came out in 2006. Firefox (formerly Firebird) had been out for around 3 years by then and was preferred over Netscape Navigator at that point; IE6 had been out for around 5 years itself.
The problem at that time was that many (most?) people were still on dial-up connections, and downloading a new browser was still problematic for many. This meant that many were still on IE6, which had a market share in the high 80s, though by 2009 it would be around 70%. The alternative browsers (mostly Firefox) had been gaining steady ground, and Chrome burst onto the scene in 2008. Each had some annoying quirks (mostly IE6, again five years old at that point).
jQuery bridged a *lot* of gaps in dealing with DOM selection and events by normalizing the interfaces that were often very different with some weird bugs all around. Beyond this, if you used *only* jQuery to register and unregister your event handling, it dealt with some really bad memory leaks (mostly IE, but others too).
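To illustrate the kind of interface normalization involved, here's a minimal sketch of the event-registration shim libraries like jQuery had to carry internally (the `addEvent` helper here is illustrative, not jQuery's actual source):

```javascript
// Sketch: normalize event registration across W3C browsers and old IE.
function addEvent(el, type, handler) {
  if (el.addEventListener) {
    el.addEventListener(type, handler, false); // W3C browsers
  } else if (el.attachEvent) {
    el.attachEvent("on" + type, handler);      // old IE (<= IE8)
  } else {
    el["on" + type] = handler;                 // last-resort fallback
  }
}
```

jQuery went further than this, also tracking registered handlers so it could detach them on teardown, which is what avoided the IE memory leaks mentioned above.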
As the article says, though, times have changed significantly; today the major browsers are much more alike than different. Most of the missing features aren't used much (yet) and are mostly easy to polyfill. Tooling is also much better: out of Node.js came CommonJS packages, which brought Browserify, followed by webpack, Rollup, and others, combined with 6to5/Babel. Today, with modern tooling, you can write modern JS, use modern UI frameworks, and create incredible web-based applications in browsers that perform well and have very little variance (if you can exclude IE altogether).
In the early-to-mid 2000s, I was a big fan of E4X (ECMAScript for XML), which allowed me to use really nice rendering approaches with Mozilla browsers, Flash/Flex, and VB.NET on the server. JSON (Ajax/Fetch) and React are much better, but didn't come around until over half a decade later.
On "alternative libraries": what jQuery did was a much lower level of abstraction than any of the frameworks/libraries mentioned. They can do the same types of work, but they aren't really the same at all.
In the end, I'd much rather be a web developer today than 15-20 years ago. It's so much nicer now, despite a lot of noise from detractors. Overall, a relatively nice article, if not very technical, from flatlogic (versus the self-promotional template sales posts they typically put up).
Having tried all of the editors listed, VS Code is hands down my favorite. The funny thing is that when I first heard about it, I almost rejected it without trying it because of poor experiences with Atom and Brackets around that time.
I understand why Yarn was started... but given that npm proper now supports most of the enhancements Yarn brought in terms of performance, I'm not sure what Yarn's role is at this point other than residual popularity.
Kind of a cool naive approach. I wouldn't use this on very large text or for large collections of strings for the most part, though it may be usable as part of a systematic map-reduce algorithm.
Another thing to consider would be creating word stems converted with phonetic encoding for comparison, along with a relation score instead of true/false, which would be closer to what search engines actually do.
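As a sketch of the relation-score idea (assumptions: a crude suffix-stripping "stem" plus Dice's coefficient over character bigrams; real engines use proper stemmers like Porter and phonetic encodings like Metaphone):

```javascript
// Very naive stemmer: strip a few common English suffixes.
function stem(word) {
  return word.toLowerCase().replace(/(ing|ed|ly|es|s)$/, "");
}

// Set of overlapping two-character slices of a string.
function bigrams(s) {
  const out = new Set();
  for (let i = 0; i < s.length - 1; i++) out.add(s.slice(i, i + 2));
  return out;
}

// Dice's coefficient over stemmed bigrams: 0 = unrelated, 1 = identical.
function relationScore(a, b) {
  const x = bigrams(stem(a));
  const y = bigrams(stem(b));
  if (x.size === 0 || y.size === 0) return a === b ? 1 : 0;
  let shared = 0;
  for (const g of x) if (y.has(g)) shared++;
  return (2 * shared) / (x.size + y.size);
}
```

With a threshold instead of an equality check, "running" and "runs" can match while "cat" and "dog" score zero.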
It's worth mentioning that there's a search cost for properties in the prototype hierarchy, so doing more than 1-2 inheritance levels is generally a bad idea depending on your use case and performance needs.
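As a rough illustration of the deeper-chain cost (a sketch using `Object.create` chains to stand in for class hierarchies; note that modern engines use inline caches, so the walk mainly costs you on uncached/polymorphic lookups):

```javascript
// A property defined only at the root of the chain.
const base = { greet() { return "hi"; } };

// Build an object whose prototype chain is `depth` links above `base`.
function deepChain(depth) {
  let obj = base;
  for (let i = 0; i < depth; i++) obj = Object.create(obj);
  return obj;
}

const shallow = deepChain(1);
const deep = deepChain(50);

// Both resolve greet() by walking up to `base`; the deep one
// traverses ~50 links on every lookup the engine can't cache.
shallow.greet();
deep.greet();
```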
Mainly in that maintaining it can become an issue... also, if you exceed a certain size it won't be indexed, so you'll have to split it into multiple sitemap files referenced from a sitemap index, and/or generate them. I've found you're far better off listing only the endpoints that have been updated since the last Google scan. Even then, Google (and Bing) will do a good job with natural discovery. The sitemap really only allows sites with frequent publication to be checked more often by Google. If you're publishing less than a couple of posts a day, there's very little reason to worry about it.
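For reference, the multiple-file case uses a sitemap index that points at the individual sitemap files, per the sitemaps.org protocol (the URLs below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-posts-1.xml</loc>
    <lastmod>2020-01-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-posts-2.xml</loc>
  </sitemap>
</sitemapindex>
```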