tracker1 comments

tracker1 2106 days ago. link 2 points
This project has no license assigned, meaning nobody can or should even look at it.
tracker1 2106 days ago. link 1 point
Yes, there are a few concerns. Regarding @misan's points: you can write accessible React, and you can add the same structure and attributes you would in static markup, since in the end it does render HTML to the DOM.
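
As a rough illustration (the component and field names here are made up, not from the thread), the same semantic elements and ARIA attributes you'd write in static HTML go straight into JSX and end up in the rendered DOM:

```js
// Hypothetical example: semantic markup and ARIA attributes in a React component
// render to the same accessible HTML a hand-written page would use.
function SearchForm({ onSearch }) {
  return (
    <form role="search" aria-label="Site search" onSubmit={onSearch}>
      <label htmlFor="q">Search</label>
      <input id="q" name="q" type="search" />
      <button type="submit">Go</button>
    </form>
  );
}
```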

As to SEO, it's not really much of a hindrance. I've worked on a couple of larger sites and hosting environments, and 5-7 years ago Google was only a few days behind in indexing computed (client-rendered) content vs. static rendering; I can only assume it's much closer to real-time today. Bing also does computed renders for content. Beyond that, SEO doesn't actually help much these days compared to general marketing efforts.

You can server-render and cache React; there are lots of tools for this, from Next.js to other frameworks, and Walmart has written and released a lot of tooling in this space. Again, I don't think it's worth it most of the time, but YMMV.
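
As a minimal sketch of the idea (not Next.js or Walmart's tooling, just ReactDOMServer with a naive in-process cache; App and the cache policy are placeholders):

```js
// Sketch: server-render a React tree and cache the resulting HTML per URL.
// "App" is a placeholder for your root component.
const express = require('express');
const React = require('react');
const { renderToString } = require('react-dom/server');
const App = require('./App');

const cache = new Map(); // naive per-process cache; real setups use a shared store or CDN

const app = express();
app.get('*', (req, res) => {
  let html = cache.get(req.url);
  if (!html) {
    html = renderToString(React.createElement(App, { url: req.url }));
    cache.set(req.url, html);
  }
  res.send(`<!doctype html><html><body><div id="root">${html}</div></body></html>`);
});

app.listen(3000);
```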

Where I would concentrate is using webpack's analyze options and paying close attention to your gzipped payload size. Ideally you want to stay at or under around 500k, which is not hard: I'm working on a fairly complex web application and it's around that, with React and the Material-UI libraries taking up the bulk and about 25% actual application code beyond that. For comparison, Grubhub is about 442k for their initial (not-logged-in) JS payload and over 2MB on a fresh load when logged in (Angular, I believe).
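
One common way to get those numbers is webpack-bundle-analyzer (my suggestion, not something named above; any stats-based analyzer works); pointing it at gzipped sizes shows what actually goes over the wire:

```js
// webpack.config.js (sketch): add a bundle report with gzipped sizes
const { BundleAnalyzerPlugin } = require('webpack-bundle-analyzer');

module.exports = {
  // ...existing entry/output/loaders...
  plugins: [
    new BundleAnalyzerPlugin({
      analyzerMode: 'static', // writes report.html instead of starting a local server
      defaultSizes: 'gzip'    // report gzipped sizes, the number that matters for payload
    })
  ]
};
```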

If you're more content-driven (blog, resume, etc.), I would tend to favor a static content generator such as Gatsby, mentioned by @misan.
tracker1 2108 days ago. link 1 point
My only criticism is that this uses a blocking sync interface... don't do this in a service app that receives external requests. You could do largely the same thing with async; you'd have to replace the array methods with for loops, though.

On a very large/deep directory structure (root on a Unix system), you'll likely get a stack overflow exception.
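
A rough sketch of both fixes, async fs calls plus an explicit queue instead of recursion (names and structure are illustrative, not from the posted project):

```js
// Walk a directory tree without blocking the event loop and without deep recursion.
const fs = require('fs/promises');
const path = require('path');

async function walk(root) {
  const files = [];
  const queue = [root];              // explicit queue, so deep trees can't overflow the stack
  while (queue.length) {
    const dir = queue.pop();
    const entries = await fs.readdir(dir, { withFileTypes: true });
    for (const entry of entries) {   // plain for loop in place of array methods
      const full = path.join(dir, entry.name);
      if (entry.isDirectory()) queue.push(full);
      else files.push(full);
    }
  }
  return files;
}

walk('/var/log').then((files) => console.log(files.length));
```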
tracker1 2109 days ago. link 2 points
This caching is per instance... the main issue with caching in memory, especially in something like a cloud function, is that functions aren't really long-lived. It's one thing if you're running a few instances of a persistent service, quite another if you're in a FaaS.

Depending on the round-trip time to the query service, it may be better not to cache at all. If the query is expensive, then it's likely best to cache in an indexed key/value store such as Redis or another local/near DB.
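
A rough sketch of that near-cache idea with the node-redis client (the key format, TTL, and runQuery are all placeholders/assumptions):

```js
// Cache expensive query results in Redis so they survive short-lived function instances.
const { createClient } = require('redis');

const redis = createClient({ url: process.env.REDIS_URL });
const ready = redis.connect(); // connect once per process/instance and reuse

async function cachedQuery(params) {
  await ready;
  const key = `query:${JSON.stringify(params)}`;

  const hit = await redis.get(key);
  if (hit) return JSON.parse(hit);

  const result = await runQuery(params);                      // placeholder for the expensive call
  await redis.set(key, JSON.stringify(result), { EX: 300 });  // expire after 5 minutes (illustrative)
  return result;
}
```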