Not sure that I like monkey patching require quite like this... and tbh if you're using babel (for any number of reasons, even on the latest node), then you can already get encapsulation with "use strict";
It's funny that when Apple announced the JS Automation APIs, I didn't quite understand why it wasn't an npm/node module in the first place... other than NIH for the engine.
It's relevant because code style and project structure are as important as raw performance, especially given that most applications don't need absolute responsiveness, or 500+ constantly updating fields/components from a feed.
I'm still not fond of the vue.js templating over React's JSX. To me, JSX feels closer to HTML, and having the component rendering *in* the component instead of as a weird binding side effect just feels better to me.
Is an async function just nicer syntactic sugar for a generator with a run(..) utility driving it?
No.
That's not quite accurate... most implementations use generator semantics under the covers, AFAIK. As such, a function can't be both async and a generator, and I don't think that will ever be possible/practical.
That said, if you were writing your own custom scheduler, the generator syntax and the ability to yield nothing would be useful. By the same token, you could create a defer method to be used inside an async function (`await defer();`). I also wrap setTimeout in a sleep function so that I can `await sleep(500)` or similar:
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));
const defer = () => run(function* () { yield; }); // arrow functions can't be generators
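The defer above leans on a run(..) utility; here's a minimal sketch of what one might look like (the name and shape are illustrative, not a standard API), roughly approximating what async/await desugars to:

```javascript
// Minimal sketch of a run(..) utility that drives a generator of promises
// to completion (error handling omitted for brevity).
function run(genFn) {
  const it = genFn();
  return Promise.resolve().then(function step(value) {
    const { value: next, done } = it.next(value);
    return done ? next : Promise.resolve(next).then(step);
  });
}

// Usage: yield promises; run(..) resolves with the generator's return value.
run(function* () {
  const x = yield Promise.resolve(2);
  return x * 3;
}).then(result => console.log(result)); // logs 6
```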
One *HUGE* niggle... I would *NEVER* outsource authentication records to a small service like auth0. I have enough trouble trusting Azure's Active Directory hosting, let alone handing over the entire authentication flow and storage of user account logins.
What happens when/if auth0 folds up shop? What happens to your user accounts?
It's just a recipe for a huge potential point of failure.
I've actually been a pretty big fan of JS since very early on. I remember using it with Netscape's server in the 90's, then again with classic ASP, sharing business logic between client and server long before node... for a while I kind of jumped ship for early ASP.Net, but there was so much friction, and performance was pitiful for the overhead.
By 2003-2004 I was ready to jump back into JS; Firebird^wFirefox was coming into vogue, and Ajax wasn't there yet. However, many of the DOM differences had gone away, and by the time we got jQuery, things had already gotten better. The Good Parts put into book form what I'd known about JS for years before. After node and npm came into being, it got so much better. I know setting up a webpack config can be cumbersome, but once you pass that hurdle, the rest of an app's development can be so much better[1]. I love React + Redux + modules for application development in the browser, and feel it should be done this way. I've seen many technologies come and go, adopted some, and passed over others. In the end, we're hitting a point where things actually feel like the right way to do things... less friction, easier refactoring.
You should use/wrap the Web Crypto API[0] where available[1].
var crypto = window.crypto || window.msCrypto; // for IE 11
var array = new Uint32Array(10);
crypto.getRandomValues(array);
The Uint32Array should now be filled with cryptographically sound random numbers. There are JS-based algorithms that generate better random numbers than Math.random, but the polyfills are a bit heavy.
If you're using webpack/browserify, the crypto library's random is polyfilled, which you can use safely.
[0] https://developer.mozilla.org/en-US/docs/Web/API/Window/crypto
[1] http://caniuse.com/#feat=cryptography
I'm not sure who you're referring to... Safari TP seems to have full ES2015 including modules, and Chrome is ES2015 feature complete aside from modules and tail call optimization. They're pretty much all over 90%, or very close to feature complete within a release target of about 3 months.
That said, it'll take a little time for older browsers to fall off (IE and Safari being the boat anchors), so we'll be stuck with babel+webpack for a while. Still, it will be nice to have a build target that doesn't need regenerator or large shim libraries.