this post was submitted on 03 Dec 2023
2 points (100.0% liked)

General Programming Discussion


One needs to send 1 million HTTP requests concurrently, in batches, and read the responses. No more than 100 requests in flight at a time.

Which approach is better, recommended, idiomatic?

  • Send 100, wait for all of them to finish, send another 100, wait again… and so on.

  • Send 100. As soon as any request among the 100 finishes, add a new one to the pool: “done, add a new one; done, add a new one”. As a stream (see the sketch after this list).
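
For concreteness, a minimal sketch of the second (streaming) option, assuming Go and its standard net/http client; a buffered channel acts as a semaphore capping in-flight requests at 100, and the URL list is a placeholder:

```go
package main

import (
	"fmt"
	"net/http"
	"sync"
)

func main() {
	// Placeholder: assume the 1 million URLs come from somewhere else.
	urls := []string{"https://example.com/1", "https://example.com/2"}

	sem := make(chan struct{}, 100) // at most 100 requests in flight
	var wg sync.WaitGroup

	for _, url := range urls {
		wg.Add(1)
		sem <- struct{}{} // blocks until one of the 100 slots frees up
		go func(u string) {
			defer wg.Done()
			defer func() { <-sem }() // release the slot as soon as this request is done

			resp, err := http.Get(u)
			if err != nil {
				fmt.Println(u, "error:", err)
				return
			}
			defer resp.Body.Close()
			fmt.Println(u, resp.Status) // read/handle the response here
		}(url)
	}
	wg.Wait()
}
```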

top 8 comments
[–] deegeese@sopuli.xyz 6 points 11 months ago (2 children)

LOL your last post got helpful answers until you were super rude and an admin deleted it.

[–] GammaGames 5 points 11 months ago (1 children)
[–] cuenca@lemm.ee 2 points 11 months ago

I knew that you'd like it.

[–] cuenca@lemm.ee 1 points 11 months ago* (last edited 11 months ago)

ZOG yes. Until you came in it.

[–] peter@feddit.uk 5 points 11 months ago (1 children)

Try asking this question 1 million times

[–] cuenca@lemm.ee 1 points 11 months ago

That's what I'm doing.

[–] vmaziman@lemm.ee 2 points 11 months ago* (last edited 11 months ago) (1 children)

Maybe producer consumer?

Producer spits out all the messages to send onto a message queue, FIFO or whatever suits you.

Parallelizable consumers (think deployed containers) listen to the queue, execute the request, get the response, and save it.

Scale the consumer count up or down as needed to deal with rate limits (see the sketch below).
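
A rough sketch of that shape, assuming Go, with an in-process channel standing in for the message queue and placeholder URLs; a real deployment would swap in an external queue and separately deployed consumers:

```go
package main

import (
	"fmt"
	"net/http"
	"sync"
)

func main() {
	const consumers = 100 // scale this up or down to respect rate limits

	jobs := make(chan string)    // the "queue": URLs to request
	results := make(chan string) // responses to save

	// Consumers: listen to the queue, execute the request, report the result.
	var wg sync.WaitGroup
	for i := 0; i < consumers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for url := range jobs {
				resp, err := http.Get(url)
				if err != nil {
					results <- fmt.Sprintf("%s: %v", url, err)
					continue
				}
				resp.Body.Close()
				results <- fmt.Sprintf("%s: %s", url, resp.Status)
			}
		}()
	}

	// Producer: push all the work onto the queue, then close it.
	go func() {
		for i := 0; i < 1_000_000; i++ {
			jobs <- fmt.Sprintf("https://example.com/item/%d", i) // placeholder URLs
		}
		close(jobs)
	}()

	// Close the results channel once every consumer has drained the queue.
	go func() {
		wg.Wait()
		close(results)
	}()

	for r := range results {
		fmt.Println(r) // "save" the response; here we just print it
	}
}
```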

[–] cuenca@lemm.ee 1 points 11 months ago

What question have you answered?