Netstalking dorks

What is it?
From some books and materials, you may already know that search engines have many special commands for refining queries. You can search not only by keywords, but also by fragments of site addresses, by file extensions, and so on. In addition, special characters let you exclude certain words or whole phrases, or, on the contrary, require that they be present.

A dork is exactly such a query: aimed, sharpened for a specific task.

Dorks are usually associated with "legal hacking": they let you use a search engine to reach hidden sections of a site that, for whatever reason, are left exposed without proper protection. (And yes, if you ever run a web resource of your own, cover it against such incidents.) Articles on dorks are usually devoted to exactly that - otherwise there would be no point in writing this post - but I want to cover dorks in a netstalking context. Home-grown hackers leak somebody's client base and are happy with themselves; for us, what goes into a dork is determined by expediency.

There are no private dorks. There are only people who have gotten better at composing them than you have. Pull yourself up to their level = get a "private" dork for free.

What does it look like in netstalking?
You can compose dorks both for delisearch and for netrandom. That is: both to search for a specific object and to get access to a broad class of objects from which you do not yet know what exactly to expect. An example of the first: searching for a book or a specific document (some building permit, say). An example of the second: all Excel spreadsheets from sites on the .gov domain; all users of all Russian forums on a chosen engine with particular interests in their profiles.

A couple of case examples.
  1. (from hex break) Netstalking in Google Drives: inurl:"/drive/folders/" site:drive.google.com
  2. (from CapyB) There is a handy dork for Tor: (site:onion.link | site:onion.cab | site:tor2web.ch | site:tor2web.org | site:onion.sh | site:tor2web.fi | site:onion.direct | site:onion.gq | site:onion.top | site:onion.rip | site:onion.guide | site:onion.to | site:onion.gold) followed by your query. It lets you find a significant share of onion resources, and unlike other crawlers it cannot be blocked by the Tor Project.
Dorks are also the main way to search the Internet of Things (those same cameras) through search engines such as shodan.io and censys.io.
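
If you want to run such IoT dorks from a script rather than from the web interface, shodan.io has an official Python client. Below is a minimal sketch, assuming you have registered and obtained an API key (some filters, such as country:, may require a paid plan); the query string "webcamxp country:RU" is only an illustrative camera-related example, not a recommendation.
Code:
# pip install shodan  (the official Shodan client library)
import shodan

API_KEY = "YOUR_API_KEY"           # assumption: you have your own key
QUERY = "webcamxp country:RU"      # illustrative IoT/camera dork, adjust to taste

api = shodan.Shodan(API_KEY)
try:
    results = api.search(QUERY)
    print("Total results:", results["total"])
    for match in results["matches"][:20]:          # first page is enough for a look
        print(match["ip_str"], match.get("port"), match.get("org", ""))
except shodan.APIError as err:
    print("Shodan error:", err)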

How to work with it?
Where do you get dorks? That is the wrong question; the right one sounds like this: how do I learn to compose my own dorks?

1. Take a list of search commands of several major search engines (the search engines themselves provide this information). Think back to a couple of your last difficult queries. Try to formulate them through these commands.

2. Study ready-made dorks. Several analyzed examples are right here in this article.

3. To compose a dork for a site or an engine, look at what the addresses of pages and file resources on that site / engine consist of (see the Trello incident). This point is closely tied to page parsing, i.e. you also need to be able to fetch and automatically analyze all of its pages with a script, in Python for instance (a minimal sketch of this follows the aaa.com example below).

4. Go from the general to the specific. Take what you are looking for and describe it in more and more detail.

For example: you need to find some ice cream from your childhood, of which all you remember is the English writing on the label and the fact that it was red.
Code:
"red ice cream" - didn't work, many options
"red ice cream - babaevskoe" - again many options
"red ice cream - babaevskoe -" red October "" - again many options
"" ice cream of red color "1997 - babaevskoye -" red October ""
... and so on. (The "-" sign excludes a term - it is one of the special characters mentioned at the beginning.)

Or an example with a more netrandom-style search by site address:

Say I am interested in the site aaa.com. I want to go through its PHP files, but not the ones where the main content of the site lives. We look for PHP files like this:
Code:
site:aaa.com filetype:php

If there are many results, we cut off the kinds of pages that account for most of them, for example:
Code:
site:aaa.com filetype:php -index -news -search

This filters out everything generated from the index.php, news.php and search.php pages. What remains in the results are the more hidden, rarely viewed pages.
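
This is where point 3 from the list above comes in: before deciding what to cut off, it helps to survey which paths actually dominate the site. A minimal sketch of such a survey in Python, assuming an ordinary crawlable HTML site and following links from the front page only (aaa.com is the same placeholder as above):
Code:
# Count which top-level path segments dominate a site's internal links,
# to decide what to exclude from the dork (-index -news -search ...).
# Minimal sketch: one page of depth, no robots.txt handling, no delays.
from collections import Counter
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup   # pip install requests beautifulsoup4

START = "https://aaa.com/"      # placeholder site from the example above

resp = requests.get(START, timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")

counts = Counter()
for a in soup.find_all("a", href=True):
    url = urljoin(START, a["href"])
    parsed = urlparse(url)
    if parsed.netloc != urlparse(START).netloc:
        continue                                  # keep only internal links
    first_segment = parsed.path.strip("/").split("/")[0] or "(root)"
    counts[first_segment] += 1

# The most frequent segments are the candidates for exclusion in the dork.
for segment, n in counts.most_common(10):
    print(f"{n:4d}  {segment}")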

This is the basis of composing dorks; the difficult cases are just variations on it. Take a close look at the URLs of your target site or sites, try different commands, and get inventive in pinning down the distinctive features of the object you are looking for. Gradually, the skill of writing effective search queries will settle into your hand as comfortably as the body of a mouse.
