Why does wget download files faster than a browser?

Gavin Martin

The difference is not small, either. It is like 10 times faster. I don't even bother using browsers to download files anymore. But I'd like to hear the explanation.

James Ramirez

>It is like 10 times faster.
Something is seriously wrong with your browser. Which browser is it, and what are you using to test this?

Julian Martinez

It's the same with many browsers. Just did a test - ran a download of a zip file first with browser, then with wget. Wget finished first and browser was still at 30%.

Carter Lewis

Are you using Windows with antivirus? The browser can invoke the AV scan on downloaded files directly.

Michael Cox

Your shit's fucked, son. You might have malware.

Brandon Collins

why does a simple, dedicated specific downloading tool work faster than a shifty bloated javascript parsing nightmare written and rewritten in a higher level language
'general purpose' anything tends to be a bad idea
I'd blame C++/Rust compilers
"jack of all trades, master of none"

Dominic Cox

it doesn't always work like this.
but you're too young to understand.

Evan Sanchez

I was thinking about this recently when I realized flashgot would let me download files via wget. Why? Why does it let me do that? What do I have to gain from it? Very strange.

Henry Howard

Firefox's engine or UI isn't written in JavaScript, dumbass.

Hunter Ward

t. microshit pajeet

Ayden Martin

Wrong, the UI is. Around 1/4 of its ~30 million LOC is JS, sweety.

Elijah Edwards

I assume it splits it into segments and downloads multiple of them in parallel. Try comparing it to firefox esr with legacy downthemall.

Aaron Cooper

It takes time for the viruses to inject themselves into your downloads

Liam Nguyen

>Wget finished first and browser was still at 30%.
First, this means your claim of "10 times faster" is false. Second, this test is inaccurate because you are running the tests simultaneously, which means each connection is fighting for dominance and has unstable speeds. Test separately and then return with the results.
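Testing separately just means something like this, run one at a time (the URL is a placeholder, substitute the file you're actually testing with):

```shell
# Time each download on its own so the two transfers don't compete for bandwidth.
# https://example.com/test.zip is a placeholder URL.
time wget -q -O /dev/null "https://example.com/test.zip"
# ...then, in a separate run (curl standing in for a second client):
time curl -s -o /dev/null "https://example.com/test.zip"
```

Compare the wall-clock times from the two runs, not two progress bars racing each other.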

Dylan Nelson

Try jdownloader, I max my connection out with it for sites that allow more than one connection.

Nathaniel Howard

>I'd blame C++/Rust compilers
Should have written Firefox in Ada. Here's your (you).

Samuel Bailey

>java
>proprietary

Justin Reed

This place really gets boring.

Zachary Diaz

IIRC most browsers don't allow multithreaded downloads.

Colton Thomas

probably a good thing for those of us that use this, they would block it if it were a problem

Liam Lewis

>that gif
>mfw it's real and not some Holla Forums doing
https://www.youtube.com/watch?v=u3WWOvFf7Go
Truly best java tutorials available:
https://www.youtube.com/watch?v=by58arnoV4c

Ayden Miller

user, that is a terrible way to do a speed test. The connections are fighting for priority, and there are multiple factors that can decide which one is favored. All you've proven is that wget is most likely given priority over the browser.

Nathan Young

Kill yourself.

Colton Russell

>simple, dedicated specific downloading tool
>simple
Take a scroll through wget's man page.

Logan Price

>download files via wget. Why?
I use that to download things on a machine that's on all the time, so if the file is huge I can turn my laptop off and it still finishes.
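A sketch of that workflow (placeholder URL): `-b` detaches wget into the background so it keeps running after you log out of the box, and `-c` resumes a partial file if the connection drops.

```shell
# On the always-on machine:
# -b: run in the background (progress is written to wget-log)
# -c: continue a partially-downloaded file instead of starting over
wget -b -c "https://example.com/huge.iso"
# Later, check on it:
tail wget-log
```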

Daniel Ross

firecox's download functionality is complete garbage. it had this bug where it says you're getting 800KB/s but you actually have 200KB/s, and it sits like that for hours. even the new FIREFOX QUANTUM SUPER ULTRA FAST ROBUST FEARLESS CONCURRENCY has the same bug. nuff said. and no you can't always use an external downloader because it's not always easy to copy the context (cookies, headers) from gay websites to wget or curl

Connor King

I agree, the constant shilling of proprietary or semi-proprietary software is quite lame.

Lucas Sanchez

wget's a mess anyway, what you want is curl.

Gabriel Mitchell

wget and curl have use cases that barely overlap. If you need wget curl is useless. If you need curl wget is useless. If you can use either then your use case is so trivial that you could even use lynx.
curl's creator recommends wget. curl vs wget is the single dumbest argument in the linuxsphere, and that's saying something.

Jace Howard

>First, this means your claim of "10 times faster" is false.
Nope. Depends on how long the rest of the browser download took.

Adam Evans

>2017
>not using aria2 instead of wget
>not utilizing multiple connections per download for 420 blazing fast downloads
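For reference, the invocation is something like this (placeholder URL; `-x` and `-s` are aria2c's short options for `--max-connection-per-server` and `--split`):

```shell
# -x 8: open up to 8 connections to the server
# -s 8: split the file into 8 segments downloaded concurrently
aria2c -x 8 -s 8 "https://example.com/big.zip"
```

Only helps if the server allows multiple connections per client, of course.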

Ethan Cook

Lots of background processes, etc etc.

Most browsers/dl agents open a few parallel connections to download a file, browsers limit the amount of active connections so you can continue browsing while a download is going on. That alone slows the speed way down.

I believe wget doesn't verify the same way browsers do, most browsers get a packet and then send a packet out to verify the packet they got is correct and in the correct order. wget just dumps the shit into a file IIRC.

Easton Nguyen

So, tl;dr OP:
Download agents have been optimized to download. Firefox has been optimized to work as your web browser.
This means they balanced your download speed against the fact that you PROBABLY want to continue browsing.
It can be overridden and DL speed optimized a bit better, but that's a RTFM question tbh OP.

Evan Sanchez

>Most browsers/dl agents open a few parallel connections to download a file, browsers limit the amount of active connections so you can continue browsing while a download is going on. That alone slows the speed way down.
HTTP as I know it doesn't let you use multiple connections for a single request.
>I believe wget doesn't verify the same way browsers do, most browsers get a packet and then send a packet out to verify the packet they got is correct and in the correct order. wget just dumps the shit into a file IIRC.
You're describing TCP. HTTP always uses TCP. Your browser and wget probably use the same TCP implementation as provided by your operating system.
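Easy to check for yourself on Linux: trace the syscalls and you'll see wget (like any browser) asking the kernel for an ordinary SOCK_STREAM, i.e. TCP, socket. Having strace installed and the placeholder URL are assumptions here.

```shell
# Both wget and a browser end up in the kernel's TCP stack; packet ordering and
# retransmission happen there, not in the application.
strace -f -e trace=socket,connect wget -q -O /dev/null "https://example.com/" 2>&1 \
  | grep SOCK_STREAM
```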

Adam Rivera

>HTTP as I know it doesn't let you use multiple connections for a single request.
I don't think that's what he meant, but rather limiting the number of concurrent downloads happening at the same time.
Albeit few clients do it for ordinary downloads, you can split a single download across multiple connections by issuing several requests with different Range header values, which let you specify which bytes you want. This is most commonly used for seeking in an audio or video file.
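A rough sketch of that trick with curl (placeholder URL and sizes; `-r` sends the Range header, and it only works if the server answers 206 Partial Content):

```shell
url="https://example.com/big.zip"      # placeholder; assume a ~100 MiB file
# Fetch the first 50 MiB and the remainder on two concurrent connections:
curl -s -r 0-52428799 -o part0 "$url" &
curl -s -r 52428800-  -o part1 "$url" &
wait
# Stitch the segments back together in order:
cat part0 part1 > big.zip && rm -f part0 part1
```

This is essentially what aria2 and the old DownThemAll do for you automatically.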

Benjamin Johnson

>You're describing TCP. HTTP always uses TCP. Your browser and wget probably use the same TCP implementation as provided by your operating system.
>probably

go look at the source code for wget, newfag. less posting, more doing at this stage in your development.

Ayden Peterson

More effort than I'm willing to invest in a post on a Filipino pig farm franchise.

Brandon Wilson

Less than 5 minutes. The MSDOS port includes tcp.h; everything else uses sockets directly.

lazy underage redditor please leave.

Samuel Lopez

>he doesn't even know what a socket is
>he doesn't see it being set to use tcp in connect_to_ip

Tyler Sullivan

I write these posts on my phone while taking a shit, that's genuinely more effort than I'm willing to put in.

Benjamin Murphy

>it's not always easy to copy the context (cookies, headers) from gay websites to wget or curl
<open console
<go to network tab
<click on link
<right click on connection that popped up
<"Copy as cURL"
Psssh nottin personnel kid.
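The pasted command comes out looking roughly like this (every value below is a made-up example of the shape DevTools produces, not a real site):

```shell
# "Copy as cURL" reproduces the browser's exact request context: headers,
# cookies, referer. The copied command writes to stdout, so add -o yourself.
curl 'https://example.com/files/archive.zip' \
  -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:60.0)' \
  -H 'Referer: https://example.com/files/' \
  -H 'Cookie: session=abc123' \
  -o archive.zip
```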

Isaiah Mitchell

HTTP pipelining.

Levi Rogers

I use wget over browsers because it seems like browsers are shit for flaky downloads. A lot of the time it doesn't even tell me the download failed, I have to find out when the archive is totally fucked or whatever.

Liam Butler

mother fucker now Google allows it. rip sites allowing more than one connection

Jeremiah Powell

Where is #include <winsock.h> faggot?

kys

Parker Gomez

>caring about Windows
<winsock2.h> is included by src/mswindows.h, or it might also come from something done in the configure script.
