The multi objects in node_redis are very inexpensive to create. As a side-effect, I thought it would be fun to let you re-use them, but this is obviously only useful under some circumstances. Go ahead and create a new multi object every time you need a new transaction. One thing to keep in mind is that you should only use multi if you actually need all of the operations to execute atomically in the Redis server.
If you just want to batch up a series of commands efficiently to save network bandwidth and reduce the number of callbacks you have to manage, just send the individual commands, one after the other. node_redis will automatically "pipeline" these requests to the server in order, and the individual command callbacks, if any, will be invoked in order.
I'm playing around with node. The passage above implies that the internal multi.queue object is never cleared once the commands have finished executing.
My question is: how would you handle this in an HTTP environment? In that case, multi.exec would execute 1 transaction for the first connected user, and 100 transactions for the 100th user (because the internal multi.queue object is never cleared).
Option 1: Should I create the multi object inside the http.createServer callback function, which would effectively kill it at the end of the function's execution? How expensive would creating and destroying this object be in terms of CPU cycles?
Option 2: The other option would be to create a new version of multi.exec(), something like multi.execAndClear(), which would clear the queue the moment Redis has executed that batch of commands.
Which option would you take? I suppose option 1 is better, since we're killing one object instead of cherry-picking parts of it, but I just want to be sure, as I'm brand new to both node and JavaScript.