A better subject would probably be compression, and even pre-compression, especially with zopfli (gzip-compatible) and content-hashed resource names.
Along similar lines: set appropriate headers in S3 and the like so that gzip shows up in the Content-Encoding header. Pretty much every browser supports it, and anyone not using a browser should be able to figure it out.
Yes, minification helps, but compression is almost as big of a boost on top of that, and well worth it.
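Something like this rough sketch, assuming the zopfli CLI is installed and using @aws-sdk/client-s3 (the bucket and file names are just illustrative):

    // Pre-compress a hashed asset with the zopfli CLI, then upload it to S3
    // with Content-Encoding set so browsers transparently decompress it.
    import { execFileSync } from "node:child_process";
    import { readFileSync } from "node:fs";
    import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

    const file = "app.3f9c2b.js"; // content-hashed name, safe to cache forever

    // zopfli writes `<file>.gz` alongside the input by default
    execFileSync("zopfli", [file]);

    const s3 = new S3Client({});
    await s3.send(new PutObjectCommand({
      Bucket: "my-static-assets",            // illustrative bucket name
      Key: file,                             // serve under the original name
      Body: readFileSync(`${file}.gz`),
      ContentType: "application/javascript",
      ContentEncoding: "gzip",               // the header browsers key off of
      CacheControl: "public, max-age=31536000, immutable",
    }));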
Beyond this, there's PNG palette quantization and compression for image resources. If your images really only have a couple of color variations, getting down to 16-32 colors (or even just under 256) can help a lot with PNG compression. Don't use PNG for photos, though; text scans are the exception. Between quantization and max zopfli compression (zopfli speaks the same deflate format PNG uses internally), you'll often get down to around 1/5 or so of the original size.
Across a project I'm working on that includes scanned images, this leads to massive storage and transmission savings for the application itself.
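For reference, the rough shape of that pipeline, assuming the pngquant and zopflipng CLIs are installed (file names are illustrative):

    // Quantize the palette with pngquant, then recompress the PNG's internal
    // deflate stream with zopflipng.
    import { execFileSync } from "node:child_process";

    const input = "scan.png";

    // reduce to at most 32 colors; --output names the quantized file explicitly
    execFileSync("pngquant", ["32", "--force", "--output", "scan.quant.png", input]);

    // zopflipng re-encodes the deflate stream as small as it can manage
    execFileSync("zopflipng", ["-y", "scan.quant.png", "scan.min.png"]);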
Beyond all of that, read some of the Souders books on web performance... most of it still holds. But above all else: test, test, test, and confirm.
Curious which/what dealerships or car manufacturers are on board with this or driving this development. I think the biggest issue is ensuring some level of human review, and the ability to short-circuit the workflow at any given step before the purchase itself.
There are a lot of shady dealings and scammers around auto sales... Having certain assurances and safeguards is definitely important here.
Disclosure: I used to work for a used auto classifieds site, and own a domain I'd intended to use for that same purpose.
Two problematic points... if you have an OpenAPI/Swagger doc or JSON Schema, you can generate more appropriate types than inference will give you.
Also, there are tools on npm for this; you don't need to use a random online tool.
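For example, a quick sketch with json-schema-to-typescript (file names illustrative); openapi-typescript covers the OpenAPI/Swagger side the same way via `npx openapi-typescript api.yaml -o api.d.ts`:

    // reads a JSON Schema file and emits TypeScript interfaces as a string
    import { compileFromFile } from "json-schema-to-typescript";
    import { writeFileSync } from "node:fs";

    const ts = await compileFromFile("listing.schema.json");
    writeFileSync("listing.d.ts", ts);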
This is kind of cool to see... but to be really honest, I absolutely hate compile-time PFM that isn't obvious. I've been participating in a project where the entire file-based routing is like that; it's auto-generated at run/build time.
On the flip side, I've created a handful of web components for some mostly stand-alone reusable bits, and I'm thinking that, in concert with some decent bundling for styling, something much simpler and closer to standards-based could come out of all of this (roughly the shape sketched below).
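A toy sketch of what I mean, as a shadow-rooted custom element with its styling bundled in (names are illustrative):

    // a self-contained, standards-based component; styles live in the shadow root
    class BrandBadge extends HTMLElement {
      constructor() {
        super();
        const shadow = this.attachShadow({ mode: "open" });
        shadow.innerHTML = `
          <style>
            :host { display: inline-block; padding: 0.25em 0.5em; }
          </style>
          <slot></slot>`;
      }
    }
    customElements.define("brand-badge", BrandBadge);
    // usage: <brand-badge>beta</brand-badge>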
I still wish that E4X had taken off a few decades ago. Even if JSX is close enough, it still really isn't in the box, and I wish that it was. Makes me want to look into some of the old on-demand loader/parser projects from early React versions.
It's not just for JS/TS, but since there are client libraries that support JS/TS, I figured I'd post here; I think it could be really useful.
Slightly disappointed there's no Deno library, or that the published library doesn't support all three.
That's my concern as well... FWIW, in most UI frameworks you can dynamically set raw SVG content, which can then use CSS custom properties for more than a single color... so you could use var(--brand-color) and match against body.dark or body.light to adjust an accent color as well.
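Something like this sketch (the selector, variable names, and colors are illustrative):

    // inline the raw SVG so CSS custom properties cascade into it
    const icon = `
      <svg viewBox="0 0 24 24" width="24" height="24" aria-hidden="true">
        <circle cx="12" cy="12" r="10" fill="var(--brand-color, currentColor)" />
      </svg>`;
    document.querySelector("#logo")!.innerHTML = icon;

    // elsewhere, in CSS:
    //   body.light { --brand-color: #1a73e8; }
    //   body.dark  { --brand-color: #8ab4f8; }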
Assuming your light/dark integration changes the html or body element as appropriate (most will/do just that), this works well. I'll generally check localStorage, falling back to the native preference, then set the html element appropriately and hook into my UI toolkit the same way. I do something similar for handling various side-menu states combined with breakpoint integrations.
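The detection bit boils down to something like this (the storage key and class names are whatever your setup expects):

    // stored choice first, then the OS preference as the fallback
    const stored = localStorage.getItem("theme"); // "dark" | "light" | null
    const prefersDark = window.matchMedia("(prefers-color-scheme: dark)").matches;
    const theme = stored ?? (prefersDark ? "dark" : "light");

    document.documentElement.classList.remove("dark", "light");
    document.documentElement.classList.add(theme);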
If you have the "real" content, then why would you need a skeleton? I get that it uses the DOM values... but you'd need a placeholder like "Last, First" or "123 Any Street" to show while the real data is loading... that's the point of skeletons, generally speaking.