Echo JS 0.11.0


jylauril comments

jylauril 1311 days ago. link 1 point
By bad idea I mean no one should ever use this in production. But yeah, I know, it's a proof of concept. :)
jylauril 1312 days ago. link 1 point
Really cool implementation and all, but sounds like a really bad idea. :)
jylauril 1321 days ago. link 1 point
Very useful article! I've been debating whether to go with a newer version of Node or jump straight to IO.
jylauril 1327 days ago. link 2 points
Ugh.. separate success and error callbacks in a Node environment? Hello jQuery generation, welcome to browser development in the year 2000.

Why not use Promises (since they're becoming standard anyway), or even the standard Node single callback with the first argument as the error? Callbacks are already hell to maintain as it is, so why contribute to that by introducing yet another BAD way to deal with them?
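The error-first convention mentioned above can be sketched like this (fetchData is a hypothetical function, purely for illustration; the callback is invoked synchronously for brevity):

```javascript
// Node's standard single-callback convention: the first argument is the
// error (or null), and any results follow it.
function fetchData(id, callback) {
  if (typeof id !== 'number') {
    callback(new TypeError('id must be a number')); // error goes first
    return;
  }
  callback(null, { id: id, name: 'example' }); // null error, then the result
}

// Every caller handles success and failure through the same function:
fetchData(1, function (err, data) {
  if (err) throw err;
  console.log(data.name);
});
```

Because the error always occupies the first slot, there is exactly one place to check for failure, instead of two separate callbacks to wire up.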

And don't get me wrong, I love the idea of standardizing modules and other common parts of code, but it would be awesome if, when you create a standard, you also used well-known standards yourself.
jylauril 1418 days ago. link 2 points
Why do I keep seeing this same article pop up on ejs every other day?
jylauril 1419 days ago. link 3 points
"I find this the clearest way to deal with optional args and I find the logic very easy to follow." I would have to argue this very strongly. It is neither clearest way, nor easy to follow.

The logic in your code is so volatile and error-prone that I figured I'd break down some parts here:

1. Your logic of testing the argument's type doesn't consider the user passing the wrong type of parameter into the function. Say I pass a number as the first parameter instead of a string: your test would assume I didn't give the name parameter at all and would add one more value into the args array, shifting the number into second place (index 1). Now the next test fails, even though I originally gave an appropriate parameter for that position. Also, what happens if I pass null as the name parameter, which seems to be what you replace the name with when it's not a string?

2. You're "re-constructing" the arguments array in reverse, so the values are not in the same order you expect in the array destructor. Unshift pushes an element in front of the array, so consider I only pass in a function as a parameter to that method: What happens now is that your first check would push a null value in front of the array. Now the function is on second place and null on first. Next you check if the second parameter is an array (it's not, it's a function), so you push the empty array in front of the array again.. so the order of the array is now: [deps, name, fn]. If I didn't pass in ANY parameters to that array, the order would be: [fn, deps, name]


Here's some test cases for you:
require(null, [], function(){});
-> results to args array of: [function(){}, [], null, null, [], function(){}]

require('foo', 'myDependency');
-> [function(){}, [], 'foo', 'myDependency']

require('foo', function(){});
-> [function(){}, [], 'foo', function(){}]



Overall, writing "perfect" argument handling always depends on the function in question and its parameters. At times it's important to verify that the given parameters are correct (i.e. when you're providing a method that a third-party developer might call), and sometimes it's a bit redundant in private functions/methods, where the application logic ensures that the passed-in parameters are always correct somewhere higher up in the call chain.

The best way to deal with arguments is to always use one single object as the argument. The benefits of this are huge compared to a pre-defined order of possibly-optional parameters: it forces the caller to name the parameters that are passed in, which eliminates the problem of accidentally giving the parameters in the wrong order.
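A minimal sketch of the single-options-object pattern (createUser and its fields are hypothetical, purely for illustration):

```javascript
// Every optional parameter is a named property on one options object,
// with a default applied when the property is absent.
function createUser(options) {
  options = options || {};
  var name = options.name || 'anonymous';
  var roles = options.roles || [];
  var active = options.active !== undefined ? options.active : true;
  return { name: name, roles: roles, active: active };
}

// Callers name every parameter, so order can never be confused:
var user = createUser({ roles: ['admin'], name: 'jyl' });
```

Adding a new optional parameter later is also backwards-compatible: existing call sites keep working, with no argument-shuffling required.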

This way of doing it also prepares you for the future (and even the present, if you're smart!) by making it easy to transition into using Promises instead of callback functions. Of course there are exceptions to this, where clearly separating the arguments is more beneficial than clumping them into one. The general rule I've used is that if I have more than two parameters, I'd rather use an object.
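The callback-to-Promise transition mentioned above can be sketched like this (loadItem is a hypothetical error-first API, invoked synchronously for brevity):

```javascript
// An existing error-first callback API taking a single options object.
function loadItem(opts, callback) {
  if (!opts || typeof opts.id !== 'number') {
    callback(new TypeError('opts.id must be a number'));
    return;
  }
  callback(null, { id: opts.id });
}

// Wrapping it in a Promise: the error-first callback maps cleanly onto
// reject (err) and resolve (result), so no call sites need reshaping.
function loadItemAsync(opts) {
  return new Promise(function (resolve, reject) {
    loadItem(opts, function (err, item) {
      if (err) reject(err);
      else resolve(item);
    });
  });
}

// Usage: loadItemAsync({ id: 1 }).then(function (item) { /* ... */ });
```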
jylauril 1437 days ago. link 4 points
Short answer to your question: yes, the global modifier makes the RegExp instance "stateful".

Here's the reason: when you use the global modifier, the RegExp object keeps track of lastIndex. The next time you run it, matching resumes from that index, and when it no longer matches, lastIndex is reset back to 0. You can access it via wtf.lastIndex; it's a writable property that you can reset to 0 manually.

The reason you don't see more people having problems with this is mostly that many people misuse regexp literals by not caching the pattern, i.e. always checking with /foo/.test(myString), which creates a NEW RegExp instance every time it runs.
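A quick demonstration of both points (the variable name wtf echoes the comment above):

```javascript
var wtf = /foo/g;             // one cached instance with the global flag
var first = wtf.test('foo');  // true; wtf.lastIndex advances to 3
var second = wtf.test('foo'); // false: matching resumed from index 3,
                              // and lastIndex was reset back to 0
wtf.lastIndex = 0;            // writable: you can also reset it yourself
var third = wtf.test('foo');  // true again

// An inline literal creates a NEW RegExp on every call, so the
// statefulness never shows up -- which is why few people notice it:
var a = /foo/g.test('foo');   // true
var b = /foo/g.test('foo');   // true again, fresh instance each time
```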

You probably know this already, but I just wanted to mention it for anyone not familiar with regular expressions: there are two things wrong with doing /^foo$/g.test('foo')

1. You're expecting the expression to be very strict by demanding that the matching string start (^) and end ($) with what you specified, which renders the global modifier redundant.

2. The purpose of the test() method is to see whether the expression matches once or more, so again the global modifier is redundant.
jylauril 1451 days ago. link 1 point
Oh wow, that was one very persistent cookie apparently. I'd cleared the cache a few times already and restarted the browser and everything, but now that I removed all the .ponyfoo.com cookies, it started to work again. Phew. Thanks for the tip. :)
jylauril 1451 days ago. link 0 point
Seriously? The response for that page is a 200 OK with no content.