LinuxQuestions.org
Old 10-29-2021, 05:39 PM   #1
Skaperen
Senior Member
 
Registered: May 2009
Location: center of singularity
Distribution: Xubuntu, Ubuntu, Slackware, Amazon Linux, OpenBSD, LFS (on Sparc_32 and i386)
Posts: 2,689
Blog Entries: 31

Rep: Reputation: 176
how to download a list of URLs within Firefox


I am logged into a website with Firefox. It has some kind of cookie-based login method I cannot figure out, but I can get logged in with Firefox, and it lets me download and save the files at various URLs. I have a list of hundreds of URLs I would like to download, and doing it manually is very slow. Running a script to do it outside of Firefox has been unsuccessful. I would like to know whether there is a way within Firefox to read this list (maybe as a "file:///" URL) and download each URL in that list to its own file somewhere locally. The site is accessed over "https://", so monitoring the web traffic is not going to get the data. Any ideas that have been used and work?
 
Old 10-29-2021, 07:43 PM   #2
frankbell
LQ Guru
 
Registered: Jan 2006
Location: Virginia, USA
Distribution: Slackware, Ubuntu MATE, Mageia, and whatever VMs I happen to be playing with
Posts: 19,384
Blog Entries: 28

Rep: Reputation: 6164
If the issue is that you must be logged in in order to download, perhaps you could look at using wget or a similar program to fetch the downloads.

Here's an article that explains how to use wget to access a site with a username and password.
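
For reference, the wget options frankbell is pointing at are --user/--password (HTTP authentication) or --post-data plus --save-cookies/--keep-session-cookies (form-based logins). Below is a minimal sketch of the same idea in Python with requests, assuming a plain form login; the login URL, form field names, credentials, and the urls.txt file are placeholders, not details of the actual site:

Code:
# Sketch only: log in once, then reuse the session to fetch a list of URLs.
# LOGIN_URL, the form field names, and the credentials are placeholders.
import requests

LOGIN_URL = "https://example.com/login"   # placeholder
URL_LIST = "urls.txt"                     # one URL per line

session = requests.Session()
session.post(LOGIN_URL, data={"username": "me", "password": "secret"})

with open(URL_LIST) as f:
    for url in (line.strip() for line in f if line.strip()):
        r = session.get(url)
        r.raise_for_status()
        # Naive filename choice: last path component of the URL.
        with open(url.rstrip("/").rsplit("/", 1)[-1], "wb") as out:
            out.write(r.content)

Whether this works depends entirely on how the site's login is implemented; if the login is driven by JavaScript (as suspected later in the thread), a plain POST like this will not be enough.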

Last edited by frankbell; 10-29-2021 at 08:10 PM. Reason: clarity
 
Old 10-29-2021, 11:45 PM   #3
Skaperen
Senior Member
 
Registered: May 2009
Location: center of singularity
Distribution: Xubuntu, Ubuntu, Slackware, Amazon Linux, OpenBSD, LFS (on Sparc_32 and i386)
Posts: 2,689

Original Poster
Blog Entries: 31

Rep: Reputation: 176
wget would have to do its own login, since it would be making a new connection, but it can't get logged in. They might be using something non-standard. The site does not have a "keep me logged in" feature, so maybe it doesn't store a persistent cookie. The URLs look the same in a console screenshot from a friend who is using Windows (I don't recognize her browser), so the login state is not in the URL. It could be a lot of things, but wget, lynx, and curl all fail to get logged in. There are connection drops; I periodically see the source port change, so it makes a number of requests on one connection, then ends it and starts a new one, and it keeps going fast enough that it doesn't seem to be logging in again.

My guess is some JavaScript-based login with some magic crypto/random number held in a cookie for the duration of the session. Maybe some way of transferring the cookies from Firefox to wget would do the job.
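
One way to attempt that transfer is to export the cookies from the Firefox profile's cookies.sqlite into a Netscape-format cookies.txt, which wget can read with --load-cookies. A rough sketch, assuming the current Firefox schema (a moz_cookies table with host, path, isSecure, expiry, name, and value columns) and a copy of cookies.sqlite taken from the profile directory (typically ~/.mozilla/firefox/<profile>/cookies.sqlite; copy it first, since Firefox keeps the live file locked):

Code:
# Sketch: convert a copy of Firefox's cookies.sqlite to Netscape cookies.txt.
# Usage (script name is arbitrary): python3 ff_cookies.py cookies.sqlite cookies.txt
import sqlite3
import sys

src, dst = sys.argv[1], sys.argv[2]

con = sqlite3.connect(src)
rows = con.execute(
    "SELECT host, path, isSecure, expiry, name, value FROM moz_cookies"
)

with open(dst, "w") as f:
    f.write("# Netscape HTTP Cookie File\n")
    for host, path, is_secure, expiry, name, value in rows:
        include_subdomains = "TRUE" if host.startswith(".") else "FALSE"
        secure = "TRUE" if is_secure else "FALSE"
        f.write("\t".join([host, include_subdomains, path, secure,
                           str(expiry), name, value]) + "\n")

# Then, roughly:  wget --load-cookies cookies.txt --input-file urls.txt

One caveat: a purely in-memory session cookie (one with no expiry, as a site without "keep me logged in" may well use) may never be written to cookies.sqlite, in which case a cookie-export extension inside Firefox is likely the easier route.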
 
Old 10-30-2021, 01:02 AM   #4
dugan
LQ Guru
 
Registered: Nov 2003
Location: Canada
Distribution: distro hopper
Posts: 11,249

Rep: Reputation: 5323
Your question is pretty clear to me. There are tons of Firefox extensions that (claim to) do that. I'd playtest those before trying to do this with wget.

Figuring out how to set up wget will be harder than you probably think (hint: JWT-based auth systems don't necessarily involve cookies).
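
To illustrate the hint: with a JWT the token typically travels in an Authorization header rather than a cookie, so copying cookies alone would not reproduce the session. A hypothetical example (the token and URL are placeholders):

Code:
# Sketch: a bearer token captured from the browser's network tools,
# sent as an Authorization header instead of a cookie.
import requests

token = "eyJhbGciOi..."   # placeholder JWT
r = requests.get("https://example.com/file/123",
                 headers={"Authorization": f"Bearer {token}"})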

Last edited by dugan; 10-30-2021 at 01:11 AM.
 
Old 10-30-2021, 01:17 AM   #5
Skaperen
Senior Member
 
Registered: May 2009
Location: center of singularity
Distribution: Xubuntu, Ubuntu, Slackware, Amazon Linux, OpenBSD, LFS (on Sparc_32 and i386)
Posts: 2,689

Original Poster
Blog Entries: 31

Rep: Reputation: 176
I was thinking of putting the list of URLs in a local HTML page and loading that. For a start, this might speed up manual downloading. Then, if I can find some JS to download all the links (or adapt one that is close ... I've coded a few very basic JS-based pages), that should handle the rest.
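
A quick sketch of generating such a page from the URL list; urls.txt and links.html are assumed file names, and note that browsers generally ignore the download attribute on cross-origin links, so this mainly makes the manual click-and-save workflow faster:

Code:
# Sketch: build a local links.html from urls.txt (one URL per line),
# to be opened in the already-logged-in Firefox via file:///
import html

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

with open("links.html", "w") as out:
    out.write("<!doctype html>\n<meta charset='utf-8'>\n<title>downloads</title>\n<ul>\n")
    for url in urls:
        out.write("  <li><a href='%s' download>%s</a></li>\n"
                  % (html.escape(url, quote=True), html.escape(url)))
    out.write("</ul>\n")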
 
  

