I run a SearXNG instance and rate limiting has never been an issue until now, so I wanted to ask what the cheapest and most privacy-respecting VPNs/proxies are to use.

  • PeriodicallyPedantic@lemmy.ca
    3 days ago

    Can you explain a bit more about the setup?

    Like, would just using a VPS with a Pangolin secure tunnel work? Or does it have to be a VPN for some other reason?

    Asking because maybe the question really boils down to which VPS provider has the best data transfer rates.

  • sem@piefed.blahaj.zone
    3 days ago

    So like, if you passed the rate limiting on to your users, would you get rate limited less by the search providers that SearXNG uses?

    • Drunk & Root@sh.itjust.worksOP
      1 day ago

      No. SearXNG functions as a proxy between the user and the search engine, so if the search engine is blocking or rate limiting the SearXNG server's IP, the only option is to use a proxy/VPN. The issue I run into with a VPN is that I can't just install it on the host, because it breaks SSH and the domain configuration. So the only options are to install the VPN inside the Docker container, or to use a proxy config setting in SearXNG, if it has one, which I still need to look into.
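      For reference, SearXNG does support routing its outgoing engine requests through a proxy in settings.yml, which avoids touching the host's networking at all. A minimal sketch, assuming a local SOCKS proxy listening on port 1080 (check the SearXNG docs for the exact keys in your version):

```yaml
# settings.yml — route all outgoing engine requests through a proxy,
# leaving SSH and the host's inbound traffic untouched
outgoing:
  proxies:
    all://:
      - socks5h://127.0.0.1:1080   # assumed local SOCKS5 proxy; socks5h resolves DNS through the proxy too
```

      With this in place the VPN/proxy client only needs to expose a local SOCKS port, so it can run in a sidecar container next to the SearXNG one.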

  • shrek_is_love@lemmy.ml
    4 days ago

    Take this with a grain of salt since I haven’t done much scraping (yet; I have a project I just started planning).

    I’ve heard you’re more likely to get blocked for using a VPN since some sites will block requests originating from data centers, which is less likely to happen coming from a residential IP address. (Although if you’re already using a VPS, the right proxy may help)

    This might be useless advice, but the best fix may simply be to increase (and randomize) the amount of time between requests.
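    As a sketch of that idea (the function and parameter names here are my own invention, not anything from SearXNG):

```python
import random
import time

def polite_sleep(base: float = 2.0, jitter: float = 1.5) -> float:
    """Sleep for `base` seconds plus a random jitter, so outgoing
    requests don't land at a fixed, easily fingerprinted interval.
    Returns the delay actually used."""
    delay = base + random.uniform(0.0, jitter)
    time.sleep(delay)
    return delay
```

    Calling this between requests gives you intervals spread over [base, base + jitter] instead of a constant gap, which is harder for a rate limiter to pattern-match.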

    And to answer your question, Mullvad is what I use, and it’s what I see recommended most often (by both Reddit and The Wirecutter), because they store so little of your data and you can even pay in cash.

    • SteveTech@aussie.zone
      4 days ago

      Nginx also has support for rate limiting built in.
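      A minimal sketch using nginx's limit_req module, if you want to rate limit your own users in front of SearXNG (zone name, rates, and the backend address are placeholders to tune):

```nginx
# http block: track clients by IP, allow 5 requests/second on average
limit_req_zone $binary_remote_addr zone=searx_limit:10m rate=5r/s;

server {
    listen 80;

    location /search {
        # absorb short bursts of up to 10 extra requests, then reject
        limit_req zone=searx_limit burst=10 nodelay;
        limit_req_status 429;
        proxy_pass http://127.0.0.1:8080;  # assumed SearXNG backend address
    }
}
```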

      On the topic of blocking, I block user agents starting with Mozilla/5.0 that are using HTTP/1.x, since all modern browsers default to HTTP/2.0 and anything else is almost always a bad bot. You can also return 426 with the Upgrade: h2c header to let some older browsers know to use HTTP/2.0.
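      One way to sketch that heuristic in nginx (the map key format, regexes, and backend address are my own and untested; the `http2 on;` directive assumes nginx ≥ 1.25, older versions use `listen 443 ssl http2;`):

```nginx
# http block: flag requests that claim to be a browser ("Mozilla/5.0")
# but arrive over HTTP/1.x — modern browsers would negotiate HTTP/2
map "$server_protocol $http_user_agent" $suspect_bot {
    default 0;
    "~^HTTP/1\.[01] Mozilla/5\.0" 1;
}

server {
    listen 443 ssl;
    http2 on;

    location / {
        if ($suspect_bot) {
            # 426 Upgrade Required: hint that the client should retry over HTTP/2
            add_header Upgrade "h2c" always;
            return 426;
        }
        proxy_pass http://127.0.0.1:8080;  # assumed backend
    }
}
```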