Why does wget download files faster than a browser?

The difference is not small, either. It is like 10 times faster. I don't even bother using browsers to download files anymore. But I'd like to hear the explanation.

Something is seriously wrong with your browser. Which is it, what are you using to test this?

It's the same with many browsers. Just did a test: started downloading a zip file in the browser, then kicked off wget on the same file. wget finished first while the browser was still at 30%.

Are you using Windows with an antivirus? The browser can invoke the AV scan on downloaded files directly.

Your shit's fucked, son. You might have malware.

'general purpose' anything tends to be a bad idea
I'd blame C++/Rust compilers
"jack of all trades, master of none"

it doesn't always work like this.
but you're too young to understand.

I was thinking about this recently when I realized flashgot would let me download files via wget. Why? Why does it let me do that? What do I have to gain from it? Very strange.

Firefox's engine or UI isn't written in JavaScript, dumbass.

t. microshit pajeet

Wrong, the UI is. Around 1/4 of its ~30 million lines of code is JS, sweetie.

I assume it splits it into segments and downloads multiple of them in parallel. Try comparing it to firefox esr with legacy downthemall.

It takes time for the viruses to inject themselves into your downloads

First, this means your claim of "10 times faster" is false. Second, this test is inaccurate because you are running the tests simultaneously, which means each connection is fighting for dominance and has unstable speeds. Test separately and then return with the results.

Try jdownloader, I max my connection out with it for sites that allow more than one connection.

Should have written Firefox in Ada. Here's your (you).

...

This place really gets boring.

IIRC most browsers don't allow multithreaded downloads.

probably a good thing for those of us that use this, they would block it if it were a problem

youtube.com/watch?v=u3WWOvFf7Go
Truly the best Java tutorials available:
youtube.com/watch?v=by58arnoV4c

user, that is a terrible way to do a speed test. The connections are fighting for priority, and there are multiple factors that can decide which one is favored. All you've proven is that wget is most likely given priority over the browser.

Kill yourself.

Take a scroll through wget's man page.

I use that to download things on a machine that's on all the time, so if the file is huge I can turn my laptop off and it will still finish.

firefox's download functionality is complete garbage. it has this bug where it says you're getting 800 KB/s when you're actually getting 200 KB/s, and it sits like that for hours. even the new FIREFOX QUANTUM SUPER ULTRA FAST ROBUST FEARLESS CONCURRENCY has the same bug. nuff said. and no, you can't always use an external downloader, because it's not always easy to copy the context (cookies, headers) from gay websites over to wget or curl
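for what it's worth, you can often recreate that context by hand if you can see the request in the browser's devtools; a rough sketch, where cookies.txt, the referer, and the URL are all placeholders:

# assumes you exported cookies.txt from the browser (Netscape cookie format)
# and copied the user-agent string from the network tab
wget --load-cookies cookies.txt \
     --header 'Referer: https://example.com/download-page' \
     --user-agent 'Mozilla/5.0 (X11; Linux x86_64; rv:58.0) Gecko/20100101 Firefox/58.0' \
     'https://example.com/file.zip'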

I agree, the constant shilling of proprietary or semi-proprietary software is quite lame.

wget's a mess anyway, what you want is curl.

wget and curl have use cases that barely overlap. If you need wget, curl is useless. If you need curl, wget is useless. If you can use either, then your use case is so trivial that you could even use lynx.
curl's creator recommends wget. curl vs wget is the single dumbest argument in the linuxsphere, and that's saying something.
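To give a concrete feel for the split (URLs are just placeholders): wget is the tool for mirroring and bulk fetching, curl is the tool for crafting individual requests.

# typical wget job: recursively mirror a directory, resuming if interrupted
wget -r -np -c 'https://example.com/pub/isos/'

# typical curl job: a one-off API request with a custom header and body
curl -X POST -H 'Content-Type: application/json' \
     -d '{"query":"shenmue"}' 'https://example.com/api/search'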

Nope. Depends on how long the rest of the browser download took.

...

Lots of background processes, etc etc.

Most browsers/download agents open a few parallel connections to download a file, but browsers limit the number of active connections so you can keep browsing while a download is running. That alone slows the speed way down.

I believe wget doesn't verify the same way browsers do, most browsers get a packet and then send a packet out to verify the packet they got is correct and in the correct order. wget just dumps the shit into a file IIRC.

so tl;dr, OP:
download agents have been optimized to download. Firefox has been optimized to work as your web browser.
This means they've balanced your download speed against the fact that you PROBABLY want to keep browsing.
It can be overridden and the download speed tuned a bit better, but that's an RTFM question tbh, OP.
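For anyone who actually wants the RTFM version, a couple of standard wget options cover the "download without wrecking my browsing" case (the rate and URL are just examples):

# cap the download so browsing stays usable
wget --limit-rate=500k 'https://example.com/big.iso'

# or background it entirely; progress goes to wget-log
wget -b 'https://example.com/big.iso'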

HTTP as I know it doesn't let you use multiple connections for a single request.
You're describing TCP. HTTP always uses TCP. Your browser and wget probably use the same TCP implementation as provided by your operating system.

I don't think that's what he meant; I think he meant limiting the number of concurrent downloads.
That said, although hardly anything does it, you can use multiple connections for a single file by setting the Range header, which lets you specify which bytes you want to download. This is most commonly used for seeking within an audio or video file.
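Rough illustration of the Range trick, assuming the server honors range requests (answers 206 Partial Content) and the file is 100 MiB; the URL is a placeholder:

# fetch the file as two halves in parallel, then stitch them together
curl -s -r 0-52428799         -o part1 'https://example.com/file.bin' &
curl -s -r 52428800-104857599 -o part2 'https://example.com/file.bin' &
wait
cat part1 part2 > file.bin
rm part1 part2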

go look at the source code for wget, newfag. less posting, more doing at this stage in your development.

More effort than I'm willing to invest in a post on a Filipino pig farm franchise.

Less than 5 minutes. The MSDOS port includes tcp.h; everything else uses sockets directly.

lazy underage redditor please leave.

...

I write these posts on my phone while taking a shit, that's genuinely more effort than I'm willing to put in.

...

HTTP pipelining.

I use wget over browsers because browsers seem to be shit at flaky downloads. A lot of the time they don't even tell me the download failed; I have to find out when the archive turns out to be totally fucked or whatever.

mother fucker now Google allows it. rip sites allowing more than one connection

Where is the #include, faggot?

kys

That header is included by src/mswindows.h, or it might also come from something done in the configure script.

I don't know and I don't care, I just use this
addons.mozilla.org/en-US/firefox/addon/cliget/
with aria2c
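roughly what that ends up running; -x/-s/-k are standard aria2c options, the URL is a placeholder:

# split the file into 8 segments and open up to 8 connections to the server
aria2c -x 8 -s 8 -k 1M 'https://example.com/file.zip'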

80% of new machines have malware/spyware

ppl that dont know shit about computers or internet

WGET IS THE ALPHA
THE APEX DOWNLOADER
IT CANNOT BE DEFEATED
Puny browsers cower in fear at its indomitable network priority

Your inability to reverse engineer software does not affect my opinions on what is considered Holla Forums

This is somewhat unrelated, but I like scraping images from porn boards using it (like any normal person). But is there any way to expand the jpgs before wget fetches them? I've been wondering this for years now but I can't find an answer.

define "expand"

life is too short to reverse engineer everything you want to use.
if the lifespan was not limited so unfairly, it could work.

Here's a script I use on the mewch hentai board occasionally. I don't claim it's well made; I'm sure someone who writes shell scripts more often than me could do much better, but it works. (The first sed got garbled, so that pattern in the version below is a guess at the original.)

#!/bin/bash
# scrape a thread: pull out the original-file links and download each one
tmpfile=$(mktemp /tmp/mewchdl.XXXXXX)
tmpfile2=$(mktemp /tmp/mewchdl2.XXXXXX)

# grab the thread HTML, keep only the lines containing the original-name links
wget -qO- "$1" | grep -e originalNameLink > "$tmpfile"

# this sed was garbled in the post; assumed intent: strip everything up to the
# href URL, i.e. markup like <a class="originalNameLink" href="URL" download="NAME">
sed 's/.*href="//' "$tmpfile" > "$tmpfile2"
# turn URL" download="NAME into URL NAME
sed 's/" download="/ /g' "$tmpfile2" > "$tmpfile"
# drop the closing "> and everything after it
sed 's/">.*//g' "$tmpfile" > "$tmpfile2"

# download each URL under its original file name
while IFS=" " read -r url fileName; do
    wget -O "$fileName" "$url"
done < "$tmpfile2"

rm "$tmpfile" "$tmpfile2"

I've had that problem for at least 16 years, which is when I started using the internet.
Another common issue I have is that on a dodgy connection (think wi-fi with poor signal) firefox or chromium will never ever complete the download, but wget will go through just fine. On a dodgy connection firefox can even corrupt the file but again wget just works.
Another thing is how only now browsers are finally implementing pausable downloads but wget has had it since forever.
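For the dodgy-connection case, this is roughly the wget incantation that "just works": -c resumes a partial file, and the retry/timeout flags keep it hammering away (the values and URL are just examples):

# resume the partial file, retry up to 50 times, wait between retries,
# and treat a 30s stall as a failure so the retry logic kicks in
wget -c --tries=50 --waitretry=10 --read-timeout=30 'https://example.com/big.iso'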

I don't think this is the case for wget. It's probably just not wasting cycles on shit like UI rendering. There is aria2c, which does allow parallel downloads of files.

Also, if OP is browsing at the same time, then Firefox is probably slowing the download down to keep the browsing pleasant. Firefox can't arbitrarily slow down wget even if he were to browse while wgetting.

Internet explorer 5

i won't use wget, far too much shit to type just to download something fast. the only way i'd recommend it is if that person loved to waste time typing a bunch of crap into a terminal that could easily be done other ways

Make an alias for wget. Problem solved.

Make that alias something memorable and fast to type in, such as "wget".

If you want speed try aria2

Ok, partner

Please be quiet and don't give tips to (obviously) mentally challenged noobs.

github.com/rockdaboot/wget2

Typing "wget www.whatever.com/some_download.x" is really easy. Not sure why you think it's hard. Maybe you meant options? Just make a script if you want options, like downloadall.sh or something for recursive shit. Learn to script.

what am i doing wrong?
--2018-01-25 20:42:26-- direct.emuparadise.me/roms/get-download.php?gid=168
Resolving direct.emuparadise.me... 111.90.159.152
Connecting to direct.emuparadise.me|111.90.159.152|:80... connected.
HTTP request sent, awaiting response... 301 Moved Permanently
Location: /roms/roms.php?gid=168 [following]
--2018-01-25 20:42:26-- direct.emuparadise.me/roms/roms.php?gid=168
Reusing existing connection to direct.emuparadise.me:80.
HTTP request sent, awaiting response... 301 Moved Permanently
Location: direct.emuparadise.me/roms/roms.php?gid=168 [following]
--2018-01-25 20:42:27-- direct.emuparadise.me/roms/roms.php?gid=168
Loaded CA certificate '/etc/ssl/certs/ca-certificates.crt'
Connecting to direct.emuparadise.me|111.90.159.152|:443... connected.
HTTP request sent, awaiting response... 301 Moved Permanently
Location: /Sega_Dreamcast_ISOs/Shenmue_(USA)/168 [following]
--2018-01-25 20:42:28-- direct.emuparadise.me/Sega_Dreamcast_ISOs/Shenmue_(USA)/168
Reusing existing connection to direct.emuparadise.me:443.
HTTP request sent, awaiting response... 200 OK
Length: 15810 (15K) [text/html]
Saving to: ‘get-download.php?gid=168’


get-download.php?gid=16 0%[ ] 0 --.-KB/s
get-download.php?gid=16 100%[===============================>] 15.44K --.-KB/s in 0.001s

2018-01-25 20:42:29 (18.9 MB/s) - ‘get-download.php?gid=168’ saved [66928]
i copied the download link and it gives me a php file. how do i make it work?

If you're going to shill you might as well tell why to use it over wget.

never mind, i found something in my repo that uses wget but has a UI with options. guess normal wget doesn't know how to deal with some links. it's not very fast though, only 300 KB/s...

I'm pretty sure emuparadise has a captcha, so you have to use a JavaScript-capable browser if you want to download from there. But there are tons of other rom sites; even archive.org has them.

nope, i said it's working, you just gotta pass the captcha to get the link, it's just going soo slow.. like normal

...

projects.gnome.org/gwget/
that's what i used

so i found a better site but i get an error with wget and gwget. how do i make it work?
$ wget doperoms.com/files/roms/sega_dreamcast/GETFILE_Shenmue v1.001 (2000)(Sega)(PAL)(M4)(Disc 1 of 4)[!].zip
bash: syntax error near unexpected token `('

wget is good

MTU

put quotes around the url
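i.e. wrap the whole URL in single quotes so the shell doesn't choke on the spaces and parentheses (single quotes also keep the ! from triggering history expansion):

wget 'doperoms.com/files/roms/sega_dreamcast/GETFILE_Shenmue v1.001 (2000)(Sega)(PAL)(M4)(Disc 1 of 4)[!].zip'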

that did fix it, but for the past couple of weeks wget & gwget have been giving an error. i posted in the question thread because i actually did see this thread was still here...

like my balls. im not flooding, faggot, im simply expanding my question so it's easier to answer