How can I view all files in a website's directory?
Is it possible to list all files and directories in a given website's directory from the Linux shell?
Something similar to:
ls -l some_directory
but instead of `some_directory`, the argument would be the website's URL. Obviously, the latter will not work.
4 Answers
I was just wondering the same thing. The following is probably not the most efficient solution, but it seems to work: it recreates the directory structure of the web server locally. (I found the first command via Stack Overflow.)
wget --spider -r --no-parent
ls -l some.served.dir.ca

Yes, it is possible. Sometimes.
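Spelled out as a sketch (example.com is a placeholder, not from the original answer; the real wget invocation needs the site's URL and network access), the spider-then-list pattern looks like this. A local stand-in tree shows what the follow-up ls would print:

```shell
# Real usage (needs network; URL is a placeholder):
#   wget --spider -r --no-parent http://example.com/
#   ls -l example.com
# wget -r mirrors the remote directory tree into a folder named after
# the host. Below, a stand-in tree demonstrates the listing step.
mkdir -p /tmp/example.com/css /tmp/example.com/img
touch /tmp/example.com/index.html /tmp/example.com/css/site.css
ls -R /tmp/example.com
```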
When you browse to a webpage whose URL names a file, the server will open the file you specified (in this case `index.html`).
If you do not specify a file and a default is present (e.g. the web server is configured to serve index.html, or index.php, ...), then requesting the bare directory URL will automagically present you with the right file.
If that file is not present then the server can do other things, such as listing the directory contents. This is quite useful when building a site; however, it is also considered unwise from a security standpoint.
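When listing is enabled, the page the server returns is just HTML, so the entries can be scraped. A minimal offline sketch (the index.html below, and the filenames in it, are invented stand-ins for a real server response):

```shell
# Stand-in for a directory-index page a server might return:
cat > /tmp/index.html <<'EOF'
<html><body><h1>Index of /files</h1>
<a href="../">../</a>
<a href="report.pdf">report.pdf</a>
<a href="data.csv">data.csv</a>
</body></html>
EOF
# Pull out the href targets, dropping the parent-directory link:
grep -o 'href="[^"]*"' /tmp/index.html \
  | sed 's/href="//;s/"$//' \
  | grep -v '^\.\./$' > /tmp/file_list
cat /tmp/file_list
```

For a real site you would fetch the page with wget or curl first; the parsing step is the same.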
TL;DR: Yes, sometimes it is possible.
However, the more practical approach is to simply SSH, [s]FTP or RDP into the web server and issue a local directory listing command.
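The SSH route in practice looks like the commented line below (user, host and path are hypothetical, not from the original answer); the remote side is just an ordinary ls, demonstrated here against a local stand-in web root:

```shell
# Real usage (hypothetical user/host/path):
#   ssh deploy@example.com 'ls -l /var/www/html'
# Same listing command against a local stand-in web root:
mkdir -p /tmp/demo_webroot
touch /tmp/demo_webroot/index.html /tmp/demo_webroot/style.css
ls -1 /tmp/demo_webroot
```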
Without recursion:
lftp -e "cls -1 > /tmp/list; exit" ""
cat /tmp/list

I think that URL fuzzing is what you are looking for. Pentest-Tools offers an easy online solution, but they do ask that you have the rights to scan the target, probably to discourage hacking.
Otherwise, download and install Kali Linux. Everyone thinks it is only for hackers, but if you build websites professionally it is a good toolkit to have. Essentially, this question asks how to create something like a sitemap, which most domains provide anyway.
Alternatively, try Arch Linux solutions.
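The fuzzing idea can be sketched offline: probe candidate paths from a wordlist and record the HTTP status for each. Here `probe` is a mock standing in for a real HTTP check (the commented curl line); the paths and responses are invented for illustration, and real scans should only ever target sites you are authorized to test:

```shell
# probe() mocks an HTTP status check so the sketch runs offline.
# For real (authorized!) use, replace its body with the curl line.
probe() {
  # curl -s -o /dev/null -w '%{http_code}' "http://example.com/$1/"
  case "$1" in
    admin|uploads) echo 200 ;;  # pretend these paths exist
    *)             echo 404 ;;  # everything else is missing
  esac
}

: > /tmp/fuzz_results
for path in admin backup images uploads; do
  echo "$path -> $(probe "$path")" >> /tmp/fuzz_results
done
cat /tmp/fuzz_results
```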