Linux - Software
I am logged into a website with Firefox. It uses some kind of cookie-based login method I cannot figure out, but once I am logged in, Firefox lets me download and save the files at various URLs. I have a list of hundreds of URLs I would like to download, and doing it manually is very slow. Running a script to do it outside of Firefox has been unsuccessful. Is there a way to have Firefox read this list (maybe as a "file:///" URL) and download each URL in the list to its own local file? The site is accessed over "https://", so monitoring the web traffic is not going to get me the data. Any ideas that have been used and work?
wget would have to do its own login, since it would be making a new connection, but it can't get logged in; the site might be using something non-standard. There is no "keep me logged in" feature, so maybe it doesn't store a persistent cookie. The URLs look the same in a console screenshot from a friend who is using Windows (I don't recognize her browser), so the login state is not in the URL. It could be a lot of things, but wget, lynx, and curl all fail to log in. I periodically see the source port change, so the browser makes a number of requests on one connection, ends it, starts a new one, and keeps going quickly; it does not seem to be logging in again each time.
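One thing that might be worth trying before giving up on the command-line tools: capture the login request itself and replay it. A hedged sketch, assuming the site logs in with an ordinary form POST (the endpoint and field names below are placeholders; the real ones can be read from the Network tab in Firefox's developer tools):

    # Placeholder login endpoint and form field names; copy the real
    # ones from the Network tab in Firefox's developer tools.
    # -c saves any cookies the server sets to cookies.txt.
    curl -c cookies.txt -d "user=ME&pass=SECRET" "https://example.com/login"
    # Reuse the saved cookies for a download:
    curl -b cookies.txt -O "https://example.com/files/a.dat"

If the login credential is computed in page JavaScript rather than sent as a plain form, this alone will not be enough.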
My guess is some JavaScript-based login with some cryptographic or random token held in a cookie for the duration of the session. Maybe some way of transferring the cookies from Firefox to wget would do the job.
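Transferring the cookies is a known approach when the session really does live in them. A minimal sketch, assuming the Firefox cookies have been exported to a Netscape-format cookies.txt (several Firefox extensions do this; the export must include session-only cookies) and that urls.txt holds one URL per line:

    # cookies.txt: Netscape-format cookie export from Firefox
    # urls.txt:    one URL per line
    wget --load-cookies cookies.txt --content-disposition -i urls.txt

If the server also checks the browser identification, passing a --user-agent string that matches Firefox may be needed as well.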
Your question is pretty clear to me. There are tons of Firefox extensions that (claim to) do that. I'd test those before trying to do this with wget.
Figuring out how to set up wget will be harder than you probably think (hint: JWT-based auth systems don't necessarily involve cookies).
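For example, if the site keeps a JWT and sends it as a bearer token in a request header rather than in a cookie, the token can be copied from a logged-in request shown in the Network tab and replayed. A sketch with curl, where the token and URL are placeholders:

    # PASTE_TOKEN_HERE is a placeholder for the JWT copied from the
    # Authorization header of a logged-in request in Firefox's dev tools.
    curl -H "Authorization: Bearer PASTE_TOKEN_HERE" \
         -o output.file "https://example.com/some/file"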
I was thinking of putting the list of URLs in a local HTML page and loading that. For a start, that might speed up manual downloading. Then, if I can find some JS to download all the links (or adapt one that is close ... I've coded a few very basic JS-based pages), it could be automated.
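One caveat with that plan: a page loaded from a "file:///" URL is a different origin from the site, so its scripts cannot fetch the logged-in URLs. A sketch that sidesteps this by running in Firefox's web console on the site itself, while logged in, so the session cookies go along automatically (the urls array is a placeholder for the real list; Firefox may prompt for each save unless it is set to save downloads automatically):

    // Run from the Firefox web console while logged in on the site.
    // The urls array is a placeholder; paste the real URLs here.
    const urls = [
        "https://example.com/files/a.dat",
        "https://example.com/files/b.dat",
    ];

    async function downloadAll() {
        for (const url of urls) {
            // Same-origin fetch; credentials: "include" sends the session cookies.
            const resp = await fetch(url, { credentials: "include" });
            if (!resp.ok) {
                console.error("failed:", url, resp.status);
                continue;
            }
            const blob = await resp.blob();
            // Save the response by clicking a temporary download link.
            const a = document.createElement("a");
            a.href = URL.createObjectURL(blob);
            a.download = url.split("/").pop() || "download";
            document.body.appendChild(a);
            a.click();
            a.remove();
            URL.revokeObjectURL(a.href);
        }
    }

    downloadAll();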