I am trying to get only the documents that were inserted into MongoDB in the last minute, and once I have them, merge them into one document and insert it into a new collection.
For example, these are my documents in the "Prices" collection:
{_id: someID, timeStamp: 1629131116370, name: Jazzy},
{_id: someID, timeStamp: 1629131117370, name: John},
{_id: someID, timeStamp: 1629151191370, name: David},
{_id: someID, timeStamp: 1629151192370, name: Julie},
{_id: someID, timeStamp: 1629151193370, name: Hobit},
Now what I want to do is create a function that gets all the documents of the last minute (in my case the last three: David, Julie, and Hobit), merges them into one, and also records when that minute started and ended, in UTC timestamp milliseconds only.
{_id: newSomeID,
minuteStarted: 1629151140000, <----- Please keep in mind this is the start of that minute.
minuteEnded: 1629151200000, <----- Please keep in mind this is the end of that minute.
allDataofthisminute: [
{_id: someID, timeStamp: 1629151191370, name: David},
{_id: someID, timeStamp: 1629151192370, name: Julie},
{_id: someID, timeStamp: 1629151193370, name: Hobit}
]
}
What I have tried:
I know this logic is complex. What I tried first was to get the current timestamp (in milliseconds) and subtract the milliseconds that have passed since the start of the current minute, which gives the timestamp at which the last minute ended. Subtracting another 60,000 ms from that value gives the timestamp at which the last minute started.
Now I have the last minute's start and end timestamps, and I have tried many things, but all in vain. I know this is a lot to ask, but I just want to understand things more clearly.
var currentTimeStamp = new Date().getTime();  // current time in ms
var currentMilliSeconds = Date.now() % 60000; // ms elapsed within the current minute
var lastMinuteStarted = currentTimeStamp - currentMilliSeconds - 60000; // start of the last minute
var lastMinuteEnded = currentTimeStamp - currentMilliSeconds;           // end of the last minute (exclusive)
Now how should I write an $and query to get all documents whose timeStamp falls between lastMinuteStarted and lastMinuteEnded? I hope I have explained it well. Thanks.
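For reference, since both bounds apply to the same field, the range can be written as `{ timeStamp: { $gte: lastMinuteStarted, $lt: lastMinuteEnded } }` without an explicit `$and`. Below is a minimal sketch of the same boundary math and filtering in plain JavaScript (no database needed), using the sample documents from the question; the `mergeLastMinute` function name and the fixed "now" value are assumptions for illustration:

```javascript
// Sketch: compute the last minute's boundaries and merge matching docs.
// Against MongoDB the equivalent filter would be:
//   { timeStamp: { $gte: minuteStarted, $lt: minuteEnded } }
function mergeLastMinute(docs, now) {
  const msIntoMinute = now % 60000;          // ms elapsed in the current minute
  const minuteEnded = now - msIntoMinute;    // start of current minute = end of last minute
  const minuteStarted = minuteEnded - 60000; // start of the last minute
  const allDataofthisminute = docs.filter(
    (d) => d.timeStamp >= minuteStarted && d.timeStamp < minuteEnded
  );
  return { minuteStarted, minuteEnded, allDataofthisminute };
}

// Example with the sample documents, pretending "now" is in the following minute:
const docs = [
  { timeStamp: 1629151191370, name: "David" },
  { timeStamp: 1629151192370, name: "Julie" },
  { timeStamp: 1629151193370, name: "Hobit" },
];
const merged = mergeLastMinute(docs, 1629151215000);
// merged.minuteStarted is 1629151140000 and merged.minuteEnded is 1629151200000,
// matching the desired output in the question; all three docs fall in the range.
```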
Update: I solved it with a big aggregation pipeline. If someone can improve it or has a better suggestion, you are more than welcome to help me out.
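One possible shape for such a pipeline, shown as a sketch rather than the poster's actual solution: `$match` on the minute's range, `$group` the matching documents into one array, `$project` the start/end timestamps in, and `$merge` into a new collection. The fixed timestamps and the `MergedPrices` collection name are assumptions for illustration:

```javascript
// Sketch of an aggregation pipeline that builds the merged minute document.
const lastMinuteEnded = 1629151200000;             // end of the minute (exclusive)
const lastMinuteStarted = lastMinuteEnded - 60000; // start of the minute

const pipeline = [
  // Keep only documents whose timeStamp falls inside the minute.
  { $match: { timeStamp: { $gte: lastMinuteStarted, $lt: lastMinuteEnded } } },
  // Collect every matching document into one array.
  { $group: { _id: null, allDataofthisminute: { $push: "$$ROOT" } } },
  // Attach the minute boundaries; $literal injects the computed constants.
  {
    $project: {
      _id: 0,
      minuteStarted: { $literal: lastMinuteStarted },
      minuteEnded: { $literal: lastMinuteEnded },
      allDataofthisminute: 1,
    },
  },
  // Write the result into a new collection (requires MongoDB 4.2+).
  { $merge: { into: "MergedPrices" } },
];
// Run with e.g. db.collection("Prices").aggregate(pipeline) against a live server.
```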